
About Us:
Tradelab Technologies Pvt Ltd is not for those seeking comfort—we are for those hungry to make a mark in the trading and fintech industry. If you are looking for just another backend role, this isn’t it. We want risk-takers, relentless learners, and those who find joy in pushing their limits every day. If you thrive in high-stakes environments and have a deep passion for performance-driven backend systems, we want you.
What You Will Do:
• Develop and optimize high-performance backend systems in Golang for trading platforms and financial services.
• Architect low-latency, high-throughput microservices that push the boundaries of speed and efficiency.
• Build event-driven, fault-tolerant systems that can handle massive real-time data streams.
• Own your work—no babysitting, no micromanagement.
• Work alongside equally driven engineers who expect nothing less than brilliance.
• Learn faster than you ever thought possible.
Must-Have Skills:
• Proven expertise in Golang (if you need to prove yourself, this isn’t the role for you).
• Deep understanding of concurrency, memory management, and system design.
• Experience with trading, market data processing, or low-latency systems.
• Strong knowledge of distributed systems, message queues (Kafka, RabbitMQ), and real-time processing.
• Hands-on with Docker, Kubernetes, and CI/CD pipelines.
• A portfolio of work that speaks louder than a resume.
Nice-to-Have Skills:
• Past experience in fintech, trading systems, or algorithmic trading.
• Contributions to open-source Golang projects.
• A history of building something impactful from scratch.
• Understanding of FIX protocol, WebSockets, and streaming APIs.


About Moative
Moative, an Applied AI company, designs and builds transformative AI solutions for traditional industries in energy, utilities, healthcare & life sciences, and more. Through Moative Labs, we build AI micro-products and launch AI startups with partners in vertical markets that align with our theses.
Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.
Our Team: Our team of 20+ employees consists of data scientists, AI/ML engineers, and mathematicians (many holding PhDs) from top engineering and research institutes such as IITs, CERN, IISc, and UZH. Our team includes academicians, IBM Research Fellows, and former founders.
Work you’ll do
As a Data Engineer, you will work on data architecture, large-scale processing systems, and data flow management. You will build and maintain optimal data architecture and data pipelines, assemble large, complex data sets, and ensure that data is readily available to data scientists, analysts, and other users. In close collaboration with ML engineers, data scientists, and domain experts, you’ll deliver robust, production-grade solutions that directly impact business outcomes. Ultimately, you will be responsible for developing and implementing systems that optimize the organization’s data use and data quality.
Responsibilities
- Create and maintain optimal data architecture and data pipelines on cloud infrastructure (such as AWS/Azure/GCP)
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements
- Build the pipeline infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Support development of analytics that utilize the data pipeline to provide actionable insights into key business metrics
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
Who you are
You are a passionate and results-oriented engineer who understands the importance of data architecture and data quality to impact solution development, enhance products, and ultimately improve business applications. You thrive in dynamic environments and are comfortable navigating ambiguity. You possess a strong sense of ownership and are eager to take initiative, advocating for your technical decisions while remaining open to feedback and collaboration.
You have experience in developing and deploying data pipelines to support real-world applications. You have a good understanding of data structures and are excellent at writing clean, efficient code to extract, create and manage large data sets for analytical uses. You have the ability to conduct regular testing and debugging to ensure optimal data pipeline performance. You are excited at the possibility of contributing to intelligent applications that can directly impact business services and make a positive difference to users.
Skills & Requirements
- 3+ years of hands-on experience as a data engineer, data architect or similar role, with a good understanding of data structures and data engineering.
- Solid knowledge of cloud infrastructure and data-related services on AWS (EC2, EMR, RDS, Redshift) and/or Azure.
- Advanced knowledge of SQL, including writing complex queries, stored procedures, views, etc.
- Strong experience with data pipeline and workflow management tools such as Luigi or Airflow (a minimal Airflow sketch follows this list).
- Experience with common relational (SQL), NoSQL, and graph databases.
- Strong experience with scripting languages such as Python (including PySpark) and Scala.
- Practical experience with basic DevOps concepts: CI/CD, containerization (Docker, Kubernetes), etc.
- Experience with big data tools (Spark, Kafka, etc) and stream processing.
- Excellent communication skills to collaborate with colleagues from both technical and business backgrounds, discuss and convey ideas and findings effectively.
- Ability to analyze complex problems, think critically for troubleshooting and develop robust data solutions.
- Ability to identify and tackle issues efficiently and proactively, conduct thorough research and collaborate to find long-term, scalable solutions.
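For a concrete flavour of the orchestration work mentioned above, here is a minimal sketch of an Airflow DAG, assuming Airflow 2.4+; the task names, schedule, and task bodies are illustrative placeholders, not a prescribed design:

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load pipeline.
# Task names, schedule, and bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (API, database, object store).
    print("extracting raw records")


def transform(**context):
    # Clean and reshape the extracted data into the target schema.
    print("transforming records")


def load(**context):
    # Write the transformed data to the warehouse or data lake.
    print("loading records")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependency chain: extract runs first, then transform, then load.
    t_extract >> t_transform >> t_load
```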
Working at Moative
Moative is a young company, but we believe strongly in thinking long-term while acting with urgency. Our ethos is rooted in innovation, efficiency, and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless. Here are some of our guiding principles:
- Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
- Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
- Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
- Avoid work about work. Process creep sets in unless we constantly question it. We are deliberate about which rituals we commit to, since they take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don’t need the person with the highest title to say that out loud.
- High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes rather than hiring someone who has less to do. We don’t like the squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high-quality work from everyone. We demand it.
If this role and our work is of interest to you, please apply. We encourage you to apply even if you believe you do not meet all the requirements listed above.
That said, you should be able to demonstrate that you are in the 90th percentile or above. This may mean that you have studied at top-notch institutions, won intellectually demanding competitions, built something of your own, or been rated as an outstanding performer by your current or previous employers.
The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python (a minimal streaming sketch follows this list).
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
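As a rough illustration of the streaming side of this role, below is a minimal PySpark Structured Streaming sketch that reads CDC-style JSON events from Kafka and lands them in S3 as Parquet. The broker, topic, schema, and paths are placeholders, and the job assumes the spark-sql-kafka package is on the Spark classpath:

```python
# Sketch: read CDC-style JSON events from Kafka and land them in S3 as Parquet.
# Broker, topic, schema, and paths are placeholders; requires the
# spark-sql-kafka-0-10 package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("cdc-orders-stream").getOrCreate()

# Only two fields are modelled here for illustration.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "cdc.orders")                 # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; cast the value to string and parse the JSON envelope.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-bucket/orders/")  # placeholder sink
    .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")
    .start()
)
query.awaitTermination()
```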
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lake house architectures.
- Knowledge of data warehousing concepts and query optimisation.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi

MANDATORY:
- Top-quality Data Architect / Data Engineering Manager / Director profile
- Must have 12+ years of experience in Data Engineering roles, with at least 2+ years in a leadership role
- Must have 7+ years of hands-on tech development experience with Java (highly preferred) or Python, Node.js, or Go
- Must have strong experience with large-scale data technologies and tools such as HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, Presto, etc.
- Strong expertise in HLD and LLD, to design scalable, maintainable data architectures
- Must have managed a team of at least 5 Data Engineers (the leadership role should be evident in the CV)
- Product companies (high-scale, data-heavy companies preferred)
PREFERRED:
- Should be from a Tier-1 college (IIT preferred)
- Candidates should have spent a minimum of 3 years at each company
- Should have recent 4+ years of experience with high-growth product startups, having implemented Data Engineering systems from an early stage in the company
ROLES & RESPONSIBILITIES:
- Lead and mentor a team of data engineers, ensuring high performance and career growth.
- Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
- Drive the development and implementation of data governance frameworks and best practices.
- Work closely with cross-functional teams to define and execute a data roadmap.
- Optimize data processing workflows for performance and cost efficiency.
- Ensure data security, compliance, and quality across all data platforms.
- Foster a culture of innovation and technical excellence within the data team.
IDEAL CANDIDATE:
- 10+ years of experience in software/data engineering, with at least 3+ years in a leadership role.
- Expertise in backend development with programming languages such as Java, PHP, Python, Node.js, or Go, along with web technologies such as JavaScript, HTML, and CSS.
- Proficiency in SQL, Python, and Scala for data processing and analytics.
- Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
- Strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice
- Experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
- Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery
- Deep knowledge of data governance, security, and compliance (GDPR, SOC2, etc.).
- Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
- Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
- Proven ability to drive technical strategy and align it with business objectives.
- Strong leadership, communication, and stakeholder management skills.
PREFERRED QUALIFICATIONS:
- Experience in machine learning infrastructure or MLOps is a plus.
- Exposure to real-time data processing and analytics.
- Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
- Prior experience in a SaaS or high-growth tech company.

Required Skills:
· 8+ years as a practitioner in data engineering or a related field.
· Strong programming proficiency in Python.
· Experience with data processing frameworks like Apache Spark or Hadoop.
· Experience working on Databricks.
· Familiarity with cloud platforms (AWS, Azure) and their data services.
· Experience with data warehousing concepts and technologies.
· Experience with message queues and streaming platforms (e.g., Kafka).
· Excellent communication and collaboration skills.
· Ability to work independently and as part of a geographically distributed team.

Job Title : Full Stack Engineer (Real-Time Audio Systems) – Voice AI
Experience : 4+ Years
Location : Gurgaon (Hybrid)
About the Role :
We’re looking for a Voice AI / Full Stack Engineer to build our real-time Voice AI platform for low-latency, intelligent voice-driven agents in healthcare and beyond.
You’ll work closely with the founding team, combining audio infrastructure, AI, and full stack development to deliver natural, production-grade voice experiences.
Hands-on experience with WebRTC, WebSocket, and streaming services is required.
Experience with TTS (Text-to-Speech) and STT (Speech-to-Text) modules is a strong plus.
Mandatory Skills :
Python (FastAPI, Async frameworks, LangChain), WebRTC, WebSockets, Redis, Kafka, Docker, AWS, real-time streaming systems, TTS/STT modules.
Responsibilities :
- Build and optimize voice-driven AI systems integrating ASR, TTS, and LLM inference with WebRTC and WebSockets (a minimal WebSocket sketch follows this list).
- Develop scalable backend APIs and streaming pipelines for real-time communication.
- Translate AI audio models into reliable, production-ready services.
- Collaborate across teams for rapid prototyping and deployment.
- Monitor and improve system performance, latency, and reliability.
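To make the real-time requirement concrete, here is a minimal FastAPI WebSocket sketch of the receive-audio/return-transcript loop described above. The transcribe helper is a hypothetical stand-in for a real streaming ASR service call:

```python
# Minimal FastAPI WebSocket loop: receive an audio frame, return a transcript.
# `transcribe` is a hypothetical stand-in for a streaming ASR service call.
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()


async def transcribe(chunk: bytes) -> str:
    # Placeholder for a real ASR call (e.g., over gRPC or a vendor SDK).
    return "partial transcript"


@app.websocket("/ws/voice")
async def voice_session(ws: WebSocket):
    await ws.accept()
    try:
        while True:
            chunk = await ws.receive_bytes()          # raw audio frame from client
            text = await transcribe(chunk)            # run it through ASR
            await ws.send_json({"transcript": text})  # stream the result back
    except WebSocketDisconnect:
        pass  # client hung up; nothing to clean up in this sketch
```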
Requirements :
- 4+ years of experience in real-time systems, streaming, or conversational AI.
- Strong in Python (FastAPI, Async frameworks, LangChain).
- Hands-on with WebRTC, WebSockets, Redis, Kafka, Docker, AWS.
- Familiarity with Next.js or similar frontend frameworks is a plus.
- Experience in healthcare tech or regulated domains preferred.
Bonus Skills :
- Contributions to open-source audio/media projects.
- Background in DSP, live streaming, or media infrastructure.
- Familiarity with Grafana, Prometheus, or other observability tools.
Why Join Us :
Be part of a team working at the intersection of AI research and product engineering, shaping next-gen voice intelligence for real-world applications.
Own your systems end-to-end, innovate fast, and make a direct impact in healthcare AI.
Interview Process :
- Screening & Technical Task
- Technical Discussion
- HR/Leadership Round

Senior Python Django Developer
Experience: Back-end development: 6 years (Required)
Location: Bangalore/ Bhopal
Job Description:
We are looking for a highly skilled Senior Python Django Developer with extensive experience in building and scaling financial or payments-based applications. The ideal candidate has a deep understanding of system design, architecture patterns, and testing best practices, along with a strong grasp of the startup environment.
This role requires a balance of hands-on coding, architectural design, and collaboration across teams to deliver robust and scalable financial products.
Responsibilities:
- Design and develop scalable, secure, and high-performance applications using Python (Django framework).
- Architect system components, define database schemas, and optimize backend services for speed and efficiency.
- Lead and implement design patterns and software architecture best practices.
- Ensure code quality through comprehensive unit testing, integration testing, and participation in code reviews.
- Collaborate closely with Product, DevOps, QA, and Frontend teams to build seamless end-to-end solutions.
- Drive performance improvements, monitor system health, and troubleshoot production issues.
- Apply domain knowledge in payments and finance, including transaction processing, reconciliation, settlements, wallets, UPI, etc.
- Contribute to technical decision-making and mentor junior developers.
Requirements:
- 6 to 10 years of professional backend development experience with Python and Django.
- Strong background in payments/financial systems or FinTech applications.
- Proven experience in designing software architecture in a microservices or modular monolith environment.
- Experience working in fast-paced startup environments with agile practices.
- Proficiency in RESTful APIs, SQL (PostgreSQL/MySQL), NoSQL (MongoDB/Redis).
- Solid understanding of Docker, CI/CD pipelines, and cloud platforms (AWS/GCP/Azure).
- Hands-on experience with test-driven development (TDD) and frameworks like pytest, unittest, or factory_boy.
- Familiarity with security best practices in financial applications (PCI compliance, data encryption, etc.).
Preferred Skills:
- Exposure to event-driven architecture (Celery, Kafka, RabbitMQ); a minimal Celery sketch follows this list.
- Experience integrating with third-party payment gateways, banking APIs, or financial instruments.
- Understanding of DevOps and monitoring tools (Prometheus, ELK, Grafana).
- Contributions to open-source or personal finance-related projects.
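For illustration, here is a minimal Celery sketch of the kind of background job this stack implies, using RabbitMQ as the broker. The broker URL, task name, and reconciliation logic are placeholders:

```python
# Minimal Celery sketch: a payment-reconciliation task queued over RabbitMQ.
# Broker URL, task name, and logic are illustrative placeholders.
from celery import Celery

app = Celery("payments", broker="amqp://guest:guest@localhost:5672//")


@app.task(bind=True, max_retries=3, default_retry_delay=30)
def reconcile_transaction(self, txn_id: str):
    try:
        # Placeholder: fetch the transaction and compare it with the gateway record.
        print(f"reconciling {txn_id}")
    except Exception as exc:
        # Retry transient failures instead of silently dropping the job.
        raise self.retry(exc=exc)


# From a Django view or signal handler:
#     reconcile_transaction.delay("txn_123")
```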
Job Types: Full-time, Permanent
Schedule:
- Day shift
Supplemental Pay:
- Performance bonus
- Yearly bonus
Ability to commute/relocate:
- JP Nagar, 5th Phase, Bangalore, Karnataka or Indrapuri, Bhopal, Madhya Pradesh: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)
Job Title- Senior Java Developer
Required Experience- 8-10 years
Location- Bangalore-Hybrid
Desired Skills- Java, microservices, Docker, Kubernetes
Good to Have - Kotlin, Kafka
Job Type- Full time
Your key responsibilities:
● Build systems to add features across our messaging platform
● Create scalable microservices that will help support our impressive growth
● Propose scalable, maintainable and cost-effective solutions
● Estimate the effort required to develop and implement new features
● Develop and execute effective testing strategies, including unit tests, integration tests, and end-to-end tests, to ensure software quality and reliability
● Collaborate with cross-functional teams to ensure seamless integration and functionality.
What you’ll bring
Required:
● Excellent software engineering skills in Java for backend services
● A strong focus on testing, with a customer-first mindset
● Experience with multi-threaded architectures, web services, caching, and event-driven pipelines.
● Practical experience in deploying code in a modern programming environment using tools such as Docker and Kubernetes
● Strong emphasis on testing and familiarity with Object-Oriented Design and design patterns.
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Strong understanding of Kafka concepts, including topics, partitions, consumers, producers, and security (a minimal consumer sketch follows this list).
· Experience with testing Kafka Connect, Kafka Streams, and other Kafka ecosystem components.
· API testing experience
· Xray and test automation experience
· Expertise with Postman/SOAP
· Agile/JIRA/Confluence
· Strong familiarity with data formats such as XML, JSON, CSV, Avro, etc.
· Strong hands-on experience with SQL and MongoDB.
· Continuous integration and automated testing.
· Working knowledge and experience of Git.
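To ground the Kafka concepts listed above, here is a minimal consumer sketch using the confluent-kafka Python client; the broker address, group id, and topic are placeholders. Consumers sharing a group.id split the topic's partitions between them:

```python
# Minimal Kafka consumer sketch (confluent-kafka client). Broker, group id,
# and topic are placeholders. Consumers sharing a group.id split partitions.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "qa-validation",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue  # no message within the poll window
        if msg.error():
            print(f"error: {msg.error()}")  # surface broker/partition errors
            continue
        # Partition and offset identify exactly where this record sits in the topic.
        print(f"partition={msg.partition()} offset={msg.offset()} value={msg.value()}")
finally:
    consumer.close()  # commit final offsets and leave the group cleanly
```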
Good-to-Have
· Troubleshooting Kafka-related issues; strong in Kafka client configuration and troubleshooting
Responsibilities of / Expectations from the Role:
1. Engage with the customer to understand the requirements, provide technical solutions, and provide value-added suggestions.
2. Help build and manage the team of Kafka and Java developers in the near future.

Responsibilities
- Design, develop, and maintain backend systems and RESTful APIs using Python (Django, FastAPI, or Flask)
- Build real-time communication features using WebSockets, SSE, and async IO
- Implement event-driven architectures using messaging systems like Kafka, RabbitMQ, Redis Streams, or NATS (a minimal Redis Streams sketch follows this section)
- Develop and maintain microservices that interact over messaging and streaming protocols
- Ensure high scalability and availability of backend services
- Collaborate with frontend developers, DevOps engineers, and product managers to deliver end-to-end solutions
- Write clean, maintainable code with unit/integration tests
- Lead technical discussions, review code, and mentor junior engineers
Requirements
- 8+ years of backend development experience, with at least 8 years in Python
- Strong experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
- Production experience with WebSockets and Server-Sent Events
- Hands-on experience with at least one messaging system: Kafka, RabbitMQ, Redis Pub/Sub, or similar
- Proficient in RESTful API design and microservices architecture
- Solid experience with relational and NoSQL databases
- Familiarity with Docker and container-based deployment
- Strong understanding of API security, authentication, and performance optimization
Nice to Have
- Experience with GraphQL or gRPC
- Familiarity with stream processing frameworks (e.g., Apache Flink, Spark Streaming)
- Cloud experience (AWS, GCP, Azure), particularly with managed messaging or pub/sub services
- Knowledge of CI/CD and infrastructure as code
- Exposure to AI engineering workflows and tools
- Interest or experience in building Agentic AI systems or integrating backends with AI agents
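As referenced above, here is a minimal asyncio sketch of a Redis Streams consumer, one of the messaging options named in the responsibilities. It assumes redis-py 4.2+; the stream name is a placeholder:

```python
# Minimal asyncio Redis Streams consumer (redis-py 4.2+). Stream name is a placeholder.
import asyncio

import redis.asyncio as redis


async def consume(stream: str = "events") -> None:
    r = redis.Redis()
    last_id = "$"  # only read entries appended after we connect
    while True:
        # Block for up to 5 seconds waiting for new entries.
        batches = await r.xread({stream: last_id}, count=10, block=5000)
        for _stream, entries in batches:
            for entry_id, fields in entries:
                print(entry_id, fields)  # process the event here
                last_id = entry_id       # advance the cursor past handled entries


asyncio.run(consume())
```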


Backend Engineer Python / Golang / Rust
Location : Bangalore, India
Experience Required : 2-3 years minimum
Job Overview
We are seeking a skilled Backend Engineer with expertise in Python, Golang, or Rust to join our engineering team. The ideal candidate will have hands-on experience in building and maintaining enterprise-level, scalable backend services using microservices architecture.
Key Requirements :
Technical Skills :
- Programming Expertise : Advanced proficiency in Python with strong knowledge of Django, FastAPI, or Flask, OR expertise in Golang or Rust for backend development.
- Microservices Architecture : Solid experience in designing, developing, and maintaining microservices-based systems.
- Database Management : Hands-on experience with PostgreSQL, MySQL, MongoDB, including database design and optimization.
- API Development : Strong experience in designing and implementing RESTful APIs and GraphQL services.
- Cloud Platforms : Proficiency with AWS, GCP, or Azure services for deployment and scaling.
- Containerization & Orchestration : Strong knowledge of Docker and Kubernetes for scalable deployments.
- Messaging & Caching : Experience with Redis, RabbitMQ, Apache Kafka, and caching strategies (Redis, Memcached).
- Version Control : Advanced Git workflows and team collaboration best practices.
Experience Requirements :
- Minimum 2-3 years of backend development experience.
- Proven track record of working on enterprise-level, production-grade applications.
- Strong background in microservices architecture and distributed systems.
- Experience in building scalable systems capable of handling high traffic loads.
- Familiarity with CI/CD pipelines, DevOps practices, and cloud-native deployments.
Responsibilities :
- Design, develop, and maintain robust, scalable backend services and APIs.
- Architect and implement microservices solutions ensuring modularity and resilience.
- Optimize application performance, database queries, and service scalability.
- Collaborate closely with frontend teams, product managers, and DevOps engineers.
- Implement security best practices and data protection measures.
- Write and maintain comprehensive unit and integration tests.
- Participate actively in code reviews and architectural discussions.
- Monitor, debug, and optimize system performance in production environments.
Preferred Qualifications :
- Strong understanding of software architecture patterns (event-driven, CQRS, hexagonal, etc.).
- Experience with Agile/Scrum methodologies.
- Contributions to open-source projects or personal backend projects.
- Experience with observability tools (Prometheus, Grafana, ELK, Jaeger).
Job Summary:
We are seeking passionate Developers with experience in Microservices architecture to join our team in Noida. The ideal candidate should have hands-on expertise in Java, Spring Boot, Hibernate, and front-end technologies like Angular, JavaScript, and Bootstrap. You will be responsible for developing enterprise-grade software applications that enhance patient safety worldwide.
Key Responsibilities:
- Develop and maintain applications using Microservices architecture.
- Work with modern technologies like Java, Spring Boot, Hibernate, Angular, Kafka, Redis, and Hazelcast.
- Utilize AWS, Git, Nginx, Tomcat, Oracle, Jira, Confluence, and Jenkins for development and deployment.
- Collaborate with cross-functional teams to design and build scalable enterprise applications.
- Develop intuitive UI/UX components using Bootstrap, jQuery, and JavaScript.
- Ensure high-performance, scalable, and secure applications for Fortune 100 pharmaceutical companies.
- Participate in Agile development, managing changing priorities effectively.
- Conduct code reviews, troubleshoot issues, and optimize application performance.
Required Skills & Qualifications:
- 5+ years of hands-on experience in Java 7/8, Spring Boot, and Hibernate.
- Strong knowledge of OOP concepts and Design Patterns.
- Experience working with relational databases (Oracle/MySQL).
- Proficiency in Bootstrap, JavaScript, jQuery, HTML, and Angular.
- Hands-on experience in Microservices-based application development.
- Strong problem-solving, debugging, and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to adapt to new technologies and manage multiple priorities.
- Experience in developing high-quality web applications.
Good to Have:
- Exposure to Kafka, Redis, and Hazelcast.
- Experience working with cloud-based solutions (AWS preferred).
- Familiarity with DevOps tools like Jenkins, Docker, and Kubernetes.
We are looking for a highly skilled Solution Architect with a passion for software engineering and deep experience in backend technologies, cloud, and DevOps. This role will be central in managing, designing, and delivering large-scale, scalable solutions.
Core Skills
- Strong coding and software engineering fundamentals.
- Experience in large-scale custom-built applications and platforms.
- Champion of SOLID principles, OO design, and pair programming.
- Agile, Lean, and Continuous Delivery – CI, TDD, BDD.
- Frontend experience is a plus.
- Hands-on with Java, Scala, Golang, Rust, Spark, Python, and JS frameworks.
- Experience with Docker, Kubernetes, and Infrastructure as Code.
- Excellent understanding of cloud technologies – AWS, GCP, Azure.
Responsibilities
- Own all aspects of technical development and delivery.
- Understand project requirements and create architecture documentation.
- Ensure adherence to development best practices through code reviews.
- Experience comparable to a DevOps/SRE providing SME-level application or platform support, with responsibility for designing and automating operational procedures and best practices
- Experience writing Python and shell scripts to perform health checks and automations
- Experience with Linux System Administration (preferably Red Hat)
- Hands-on experience with multi-tenant hosting environments for middleware applications (for example: a centrally managed platform or infrastructure as a service)
- Experience with implementing observability, monitoring, and alerting tools
- Excellent written and oral English communication skills. The candidate must write user-facing documentation, prepare and deliver presentations to an internal audience and effectively interact with upper management, colleagues, and customers
- Independent problem-solving skills, self-motivated, and a mindset for taking ownership
- A minimum of 5 years of infrastructure production support or DevOps experience
Additional Technical Skills
Experience with broker-based messaging infrastructure such as Apache Kafka, IBM MQ (or similar technology like ActiveMQ, Azure Service Bus) including configuration and performance tuning
Experience with public/private cloud and containerization technologies (e.g. Kubernetes)
Experience with Agile development methodology, CI/CD and automated build pipelines
Experience with DevOps methodology (e.g. Phoenix Project)
Experience with tools such as Jira, Confluence and ServiceNow
Experience working with JSON, XML, Google Protocol Buffers, Avro, FIX
Experience with troubleshooting tools such as TCPdump and Wireshark
Experience with NoSQL databases such as MongoDB and Redis
Interest in and understanding of emerging IT trends
Experience with system architecture design
Title: Data Platform / Database Architect (Postgres + Kafka) — AI‑Ready Data Infrastructure
Location: Noida (Hybrid). Remote within IST±3 considered for exceptional candidates.
Employment: Full‑time
About Us
We are building a high‑throughput, audit‑friendly data platform that powers a SaaS for financial data automation and reconciliation. The stack blends OLTP (Postgres), streaming (Kafka/Debezium), and OLAP (ClickHouse/Snowflake/BigQuery), with hooks for AI use‑cases (vector search, feature store, RAG).
Role Summary
Own the end‑to‑end design and performance of our data platform—from multi‑tenant Postgres schemas to CDC pipelines and analytics stores—while laying the groundwork for AI‑powered product features.
What You’ll Do
• Design multi-tenant Postgres schemas (partitioning, indexing, normalization, RLS), and define retention/archival strategies (a minimal schema sketch follows this list).
• Make Postgres fast and reliable: EXPLAIN/ANALYZE, connection pooling, vacuum/bloat control, query/index tuning, replication.
• Build event‑streaming/CDC with Kafka/Debezium (topics, partitions, schema registry), and deliver data to ClickHouse/Snowflake/BigQuery.
• Model analytics layers (star/snowflake), orchestrate jobs (Airflow/Dagster), and implement dbt‑based transformations.
• Establish observability and SLOs for data: query/queue metrics, tracing, alerting, capacity planning.
• Implement data security: encryption, masking, tokenization of PII, IAM boundaries; contribute to PCI‑like audit posture.
• Integrate AI plumbing: vector embeddings (pgvector/Milvus), basic feature‑store patterns (Feast), retrieval pipelines and metadata lineage.
• Collaborate with backend/ML/product to review designs, coach engineers, write docs/runbooks, and lead migrations.
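As referenced in the first bullet, here is a minimal sketch of the multi-tenant Postgres patterns involved: a hash-partitioned table plus a row-level-security policy keyed to a per-session setting. Table, column, and setting names are illustrative, and the DDL is issued through psycopg2 purely for convenience:

```python
# Sketch: hash-partitioned multi-tenant table plus a row-level-security policy
# keyed to a per-session setting. Names are illustrative; assumes psycopg2.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS ledger_entries (
    tenant_id  uuid        NOT NULL,
    entry_id   bigint      NOT NULL,
    amount     numeric     NOT NULL,
    created_at timestamptz NOT NULL DEFAULT now(),
    PRIMARY KEY (tenant_id, entry_id)
) PARTITION BY HASH (tenant_id);

-- One of four hash partitions; p1..p3 are created the same way.
CREATE TABLE IF NOT EXISTS ledger_entries_p0
    PARTITION OF ledger_entries FOR VALUES WITH (MODULUS 4, REMAINDER 0);

ALTER TABLE ledger_entries ENABLE ROW LEVEL SECURITY;

-- Each session sets app.tenant_id; the policy hides other tenants' rows.
CREATE POLICY tenant_isolation ON ledger_entries
    USING (tenant_id = current_setting('app.tenant_id')::uuid);
"""

with psycopg2.connect("dbname=platform") as conn:  # placeholder DSN
    with conn.cursor() as cur:
        cur.execute(DDL)
```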
Must‑Have Qualifications
• 6+ years building high‑scale data platforms with deep PostgreSQL experience (partitioning, advanced indexing, query planning, replication/HA).
• Hands‑on with Kafka (or equivalent) and Debezium/CDC patterns; schema registry (Avro/Protobuf) and exactly‑once/at‑least‑once tradeoffs.
• One or more analytics engines at scale: ClickHouse, Snowflake, or BigQuery, plus strong SQL.
• Python for data tooling (pydantic, SQLAlchemy, or similar); orchestration with Airflow or Dagster; transformations with dbt.
• Solid cloud experience (AWS/GCP/Azure)—networking, security groups/IAM, secrets management, cost controls.
• Pragmatic performance engineering mindset; excellent communication and documentation.
Nice‑to‑Have
• Vector/semantic search (pgvector/Milvus/Pinecone), feature store (Feast), or RAG data pipelines.
• Experience in fintech‑style domains (reconciliation, ledgers, payments) and SOX/PCI‑like controls.
• Infra‑as‑Code (Terraform), containerized services (Docker/K8s), and observability stacks (Prometheus/Grafana/OpenTelemetry).
• Exposure to Go/Java for stream processors/consumers.
• Lakehouse formats (Delta/Iceberg/Hudi).
Job Title: Senior Java Developer – Multi-Tenant SaaS / Microservices / Pub-Sub
About the Role
We are seeking a highly skilled Senior Java Developer with strong leadership abilities to join our team. The ideal candidate will have deep expertise in Java, microservices architecture, multi-tenant SaaS systems, pub/sub messaging, and cloud-based deployments on AWS EKS. This role requires not only technical mastery but also the ability to mentor teams, influence architecture decisions, and ensure best practices through Test-Driven Development (TDD).
Key Responsibilities
- Lead design and development of Java-based multi-tenant SaaS applications using microservices architecture.
- Implement pub/sub messaging systems for event-driven communication.
- Deploy, monitor, and optimize services on AWS EKS.
- Ensure system reliability, scalability, and security using AWS CloudWatch and other observability tools.
- Apply TDD principles to drive high-quality, maintainable code.
- Collaborate with product managers, architects, and other developers to translate requirements into technical solutions.
- Lead code reviews, mentor junior engineers, and promote engineering best practices.
- Take ownership of end-to-end delivery, from architecture design to production rollout.
Required Skills & Qualifications
- Proficiency in Java with 6+ years of professional development experience.
- Strong background in microservices architecture and multi-tenant SaaS systems.
- Expertise in pub/sub messaging patterns (e.g., Kafka, RabbitMQ, or AWS SNS/SQS).
- Hands-on experience with AWS EKS and container orchestration (Kubernetes).
- Strong understanding of monitoring and logging using AWS CloudWatch or similar tools.
- Proven experience applying Test-Driven Development (TDD) in real-world projects.
- Demonstrated leadership skills – mentoring, decision-making, and driving engineering excellence.
- Solid understanding of software engineering best practices, CI/CD, and version control (Git).
Senior Associate Technology L1 – Java Microservices
Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.
Job Description
We are looking for a Senior Associate Technology Level 1 - Java Microservices Developer to join our team of bright thinkers and doers. You’ll use your problem-solving creativity to design, architect, and develop high-end technology solutions that solve our clients’ most complex and challenging problems across different industries.
We are on a mission to transform the world, and you will be instrumental in shaping how we do it with your ideas, thoughts, and solutions.
Your Impact:
• Drive the design, planning, and implementation of multifaceted applications, giving you breadth and depth of knowledge across the entire project lifecycle.
• Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients’ business.
• Constantly innovate and evaluate emerging technologies and methods to provide scalable and elegant solutions that help clients achieve their business goals.
Qualifications
➢ 5 to 7 Years of software development experience
➢ Strong development skills in Java JDK 1.8 or above
➢ Java fundamentals such as exception handling, serialization/deserialization, and immutability concepts
➢ Good fundamental knowledge of Enums, Collections, Annotations, Generics, autoboxing, and data structures
➢ Database RDBMS/NoSQL (SQL, joins, indexing)
➢ Multithreading (Re-entrant Lock, Fork & Join, Sync, Executor Framework)
➢ Spring Core & Spring Boot, security, transactions
➢ Hands-on experience with JMS (ActiveMQ, RabbitMQ, Kafka, etc.)
➢ Memory management (JVM configuration, profiling, GC), performance tuning, and testing (JMeter or a similar tool)
➢ DevOps (CI/CD: Maven/Gradle, Jenkins, quality plugins, Docker and containerization)
➢ Logical/analytical skills; thorough understanding of OOPS concepts, design principles, and implementation of different types of design patterns
➢ Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j)
➢ Experience writing JUnit test cases using Mockito/PowerMock frameworks
➢ Should have practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN etc.
➢ Good communication skills and ability to work with global teams to define and deliver on projects.
➢ Sound understanding of and experience in software development processes and test-driven development
➢ Cloud – AWS / Azure / GCP / PCF or any private cloud would also be fine
➢ Experience in Microservices
Role & Responsibilities :
- Lead the design, analysis, and implementation of technical solutions.
- Take full ownership of product features.
- Participate in detailed discussions with the product management team regarding requirements.
- Work closely with the engineering team to design and implement scalable solutions.
- Create detailed functional and technical specifications.
- Follow Test-Driven Development (TDD) and deliver high-quality code.
- Communicate proactively with your manager regarding risks and progress.
- Mentor junior team members and provide technical guidance.
- Troubleshoot and resolve production issues with RCA and long-term solutions
Required Skills & Experience :
- Bachelors/Masters degree in Computer Science or related field with a solid academic track record.
- 6+ years of hands-on experience in backend development for large-scale enterprise products.
- Strong programming skills in Java; familiarity with Python is a plus.
- Deep understanding of data structures, algorithms, and problem-solving.
- Proficient in Spring Boot and RESTful APIs.
- Experience with technologies like Elasticsearch, Kafka, MongoDB, Hazelcast, Ceph, etc.
- Strong experience in building scalable, concurrent applications.
- Exposure to Service-Oriented Architecture (SOA) and Test-Driven Development (TDD).
- Excellent communication and collaboration skills.
Preferred Technologies :
- Java
- Spring Boot, J2EE
- ElasticSearch
- Kafka
- MongoDB, Ceph
- AWS
- Storm, Hazelcast
- TDD, SOA

We are seeking a highly skilled React JS Developer with exceptional DOM manipulation expertise and real-time data handling experience to join our team. You'll be building and optimizing high-performance user interfaces for stock market trading applications where milliseconds matter and data flows continuously.
The ideal candidate thrives in fast-paced environments, understands the intricacies of browser performance, and has hands-on experience with WebSockets and real-time data streaming architectures.
Key Responsibilities
Core Development
- Advanced DOM Operations: Implement complex, performance-optimized DOM manipulations for real-time trading interfaces
- Real-time Data Management: Build robust WebSocket connections and handle high-frequency data streams with minimal latency
- Performance Engineering: Create lightning-fast, scalable front-end applications that process thousands of market updates per second
- Custom Component Architecture: Design and build reusable, high-performance React components optimized for trading workflows
Collaboration & Integration
- Work closely with traders, quants, and backend developers to translate complex trading requirements into intuitive interfaces
- Collaborate with UX/UI designers and product managers to create responsive, trader-focused experiences
- Integrate with real-time market data APIs and trading execution systems
Technical Excellence
- Implement sophisticated data visualizations and interactive charts using libraries like Chart.js, TradingView, or custom D3.js solutions
- Ensure cross-browser compatibility and responsiveness across multiple devices and screen sizes
- Debug and resolve complex performance issues, particularly in real-time data processing and rendering
- Maintain high-quality code through reviews, testing, and comprehensive documentation
Required Skills & Experience
React & JavaScript Mastery
- 5+ years of professional React.js development with deep understanding of React internals, hooks, and advanced patterns
- Expert-level JavaScript (ES6+) with strong proficiency in asynchronous programming, closures, and memory management
- Advanced HTML5 & CSS3 skills with focus on performance and cross-browser compatibility
Real-time & Performance Expertise
- Proven experience with WebSockets and real-time data streaming protocols (a minimal feed sketch follows this list)
- Strong DOM manipulation skills - direct DOM access, virtual scrolling, efficient updates, and performance optimization
- RESTful API integration with experience in handling high-frequency data feeds
- Browser performance optimization - understanding of rendering pipeline, memory management, and profiling tools
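For context, this is the server side such a UI typically talks to: a tiny mock market-data feed pushing JSON ticks over a WebSocket, sketched with the third-party Python websockets library (v11+ assumed; the symbol and tick shape are placeholders for a real exchange feed):

```python
# Mock market-data feed: pushes JSON ticks over a WebSocket at ~20 updates/sec.
# Uses the third-party `websockets` library (v11+); symbol and tick shape are
# placeholders for a real exchange feed.
import asyncio
import json
import random

import websockets


async def stream_ticks(ws):
    while True:
        tick = {"symbol": "ACME", "price": round(100 + random.uniform(-1, 1), 2)}
        await ws.send(json.dumps(tick))
        await asyncio.sleep(0.05)


async def main():
    async with websockets.serve(stream_ticks, "localhost", 8765):
        await asyncio.Future()  # serve until the process is stopped


asyncio.run(main())
```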
Development Tools & Practices
- Proficiency with modern build tools: Webpack, Babel, Vite, or similar
- Experience with Git version control and collaborative development workflows
- Agile/Scrum development environment experience
- Understanding of testing frameworks (Jest, React Testing Library)
Financial Data Visualization
- Experience with financial charting libraries: Chart.js, TradingView, D3.js, or custom visualization solutions
- Understanding of market data structures, order books, and trading terminology
- Knowledge of data streaming optimization techniques for financial applications
Nice-to-Have Skills
Domain Expertise
- Prior experience in stock market, trading, or financial services - understanding of trading workflows, order management, risk systems
- Algorithmic trading knowledge or exposure to quantitative trading systems
- Financial market understanding - equities, derivatives, commodities
Technical Plus Points
- Backend development experience with GoLang, Python, or Node.js
- Database knowledge: SQL, NoSQL, time-series databases (InfluxDB, TimescaleDB)
- Cloud platform experience: AWS, Azure, GCP for deploying scalable applications
- Message queue systems: Redis, RabbitMQ, Kafka, NATS for real-time data processing
- Microservices architecture understanding and API design principles
Advanced Skills
- Service Worker implementation for offline-first applications
- Progressive Web App (PWA) development
- Mobile-first responsive design expertise
Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field (or equivalent professional experience)
- 5+ years of professional React.js development with demonstrable experience in performance-critical applications
- Portfolio or examples of complex real-time applications you've built
- Financial services experience strongly preferred
Why You'll Love Working Here
We're a team that hustles—plain and simple. But we also believe life outside work matters. No cubicles, no suits—just great people doing great work in a space built for comfort and creativity.
What We Offer
💰 Competitive salary – Get paid what you're worth
🌴 Generous paid time off – Recharge and come back sharper
🌍 Work with the best – Collaborate with top-tier global talent
✈️ Adventure together – Annual offsites (mostly outside India) and regular team outings
🎯 Performance rewards – Multiple bonuses for those who go above and beyond
🏥 Health covered – Comprehensive insurance so you're always protected
⚡ Fun, not just work – On-site sports, games, and a lively workspace
🧠 Learn and lead – Regular knowledge-sharing sessions led by your peers
📚 Annual Education Stipend – Take any external course, bootcamp, or certification that makes you better at your craft
🏋️ Stay fit – Gym memberships with equal employer contribution to keep you at your best
🚚 Relocation support – Smooth move? We've got your back
🏆 Friendly competition – Work challenges and extracurricular contests to keep things exciting
We work hard, play hard, and grow together. Join us.
Key Responsibilities:-
- Design, build, and enhance Salesforce applications using Apex, Lightning Web Components (LWC), Visualforce, and SOQL.
- Implement integrations with external systems using REST APIs and event-driven messaging (e.g., Kafka).
- Collaborate with architects and business analysts to translate requirements into scalable, maintainable solutions.
- Establish and follow engineering best practices, including source control (Git), code reviews, branching strategies, CI/CD pipelines, automated testing, and environment management.
- Establish and maintain Azure DevOps-based workflows (repos, pipelines, automated testing) for Salesforce engineering.
- Ensure solutions follow Salesforce security, data modeling, and performance guidelines.
- Participate in Agile ceremonies, providing technical expertise and leadership within sprints and releases.
- Optimize workflows, automations, and data processes across Sales Cloud, Service Cloud, and custom Salesforce apps.
- Provide technical mentoring and knowledge sharing when required.
- Support production environments, troubleshoot issues, and drive root-cause analysis for long-term reliability.
- Stay current on Salesforce platform updates, releases, and new features, recommending adoption where beneficial.
Required Qualifications:-
- Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
- 6+ years of Salesforce development experience with strong knowledge of Apex, Lightning Web Components, and Salesforce APIs.
- Proven experience with Salesforce core clouds (Sales Cloud, Service Cloud, or equivalent).
- Strong hands-on experience with API integrations (REST/SOAP) and event-driven architectures (Kafka, Pub/Sub).
- Solid understanding of engineering practices: Git-based source control (Salesforce DX/metadata), branching strategies, CI/CD, automated testing, and deployment management.
- Familiarity with Azure DevOps repositories and pipelines.
- Strong knowledge of Salesforce data modeling, security, and sharing rules.
- Excellent problem-solving skills and ability to collaborate across teams.
Preferred Qualifications:-
- Salesforce Platform Developer II certification (or equivalent advanced credentials).
- Experience with Health Cloud, Financial Services Cloud, or other industry-specific Salesforce products.
- Experience implementing logging, monitoring, and observability within Salesforce and integrated systems.
- Background in Agile/Scrum delivery with strong collaboration skills.
- Prior experience establishing or enforcing engineering standards across Salesforce teams.


Senior Software Engineer
Challenge convention and work on cutting edge technology that is transforming the way our customers manage their physical, virtual and cloud computing environments. Virtual Instruments seeks highly talented people to join our growing team, where your contributions will impact the development and delivery of our product roadmap. Our award-winning Virtana Platform provides the only real-time, system-wide, enterprise scale solution for providing visibility into performance, health and utilization metrics, translating into improved performance and availability while lowering the total cost of the infrastructure supporting mission-critical applications.
We are seeking an individual with expert knowledge in Systems Management and/or Systems Monitoring Software, Observability platforms and/or Performance Management Software and Solutions with insight into integrated infrastructure platforms like Cisco UCS, infrastructure providers like Nutanix, VMware, EMC & NetApp and public cloud platforms like Google Cloud and AWS to expand the depth and breadth of Virtana Products.
Work Location: Pune/ Chennai
Job Type: Hybrid
Role Responsibilities:
- The engineer will be primarily responsible for architecture, design and development of software solutions for the Virtana Platform
- Partner and work closely with cross functional teams and with other engineers and product managers to architect, design and implement new features and solutions for the Virtana Platform.
- Communicate effectively across the departments and R&D organization having differing levels of technical knowledge.
- Work closely with UX Design, Quality Assurance, DevOps and Documentation teams. Assist with functional and system test design and deployment automation
- Provide customers with complex and end-to-end application support, problem diagnosis and problem resolution
- Learn new technologies quickly and leverage 3rd party libraries and tools as necessary to expedite delivery
Required Qualifications:
- Minimum of 7+ years of progressive experience with back-end development in a Client Server Application development environment focused on Systems Management, Systems Monitoring and Performance Management Software.
- Deep experience in public cloud environments using Kubernetes and other distributed managed services such as Kafka (Google Cloud and/or AWS)
- Experience with CI/CD and cloud-based software development and delivery
- Deep experience with integrated infrastructure platforms and experience working with one or more data collection technologies like SNMP, REST, OTEL, WMI, WBEM.
- Minimum of 6 years of development experience with one or more high-level languages such as Go, Python, or Java. Deep experience with one of these languages is required.
- Bachelor’s or Master’s degree in computer science, Computer Engineering or equivalent
- Highly effective verbal and written communication skills and ability to lead and participate in multiple projects
- Well versed with identifying opportunities and risks in a fast-paced environment and ability to adjust to changing business priorities
- Must be results-focused, team-oriented and with a strong work ethic
Desired Qualifications:
- Prior experience with other virtualization platforms like OpenShift is a plus
- Prior experience as a contributor to engineering and integration efforts with strong attention to detail and exposure to Open-Source software is a plus
- Demonstrated ability as a lead engineer who can architect, design and code with strong communication and teaming skills
- Deep development experience with the development of Systems, Network and performance Management Software and/or Solutions is a plus
About Virtana: Virtana delivers the industry’s broadest and deepest observability platform, allowing organizations to monitor infrastructure, de-risk cloud migrations, and reduce cloud costs by 25% or more.
Over 200 Global 2000 enterprise customers, such as AstraZeneca, Dell, Salesforce, Geico, Costco, Nasdaq, and Boeing, have valued Virtana’s software solutions for over a decade.
Our modular platform for hybrid IT digital operations includes Infrastructure Performance Monitoring and Management (IPM), Artificial Intelligence for IT Operations (AIOps), Cloud Cost Management (FinOps), and Workload Placement Readiness solutions. Virtana is simplifying the complexity of hybrid IT environments with a single cloud-agnostic platform across all the categories listed above. The $30B IT Operations Management (ITOM) software market is ripe for disruption, and Virtana is uniquely positioned for success.

Data Analytics Lead
Responsibilities:
· Oversee the design, development, and implementation of data analysis solutions to meet business needs.
· Work closely with business stakeholders and the Aviation SME to define data requirements, project scope, and deliverables.
· Drive the design and development of analytics data models and data warehouse designs.
· Develop and maintain data quality standards and procedures.
· Manage and prioritize data analysis projects, ensuring timely completion.
· Identify opportunities to improve data analysis processes and tools.
· Collaborate with Data Engineers and Data Architects to ensure data solutions align with the overall data platform architecture.
· Evaluate and recommend new data analysis tools and technologies.
· Contribute to the development of best practices for data analysis.
· Participate in project meetings and provide input on data-related issues, risks and requirements.
Qualifications
· 8+ years of experience as a Data Analytics Lead, with experience leading or mentoring a team.
· Extensive experience with cloud-based data modelling and data warehousing solutions using Azure Databricks.
· Proven experience in data technologies and platforms, ETL processes and tools, preferably using Azure Data Factory, Azure Databricks (Spark), Delta Lake.
· Advanced proficiency in data visualization tools such as Power BI.
Data Analysis and Visualization:
- Experience in data analysis, statistical modelling, and machine learning techniques.
- Proficiency in analytical tools like Python, R, and libraries such as Pandas, NumPy for data analysis and modelling.
- Strong expertise in Power BI, Superset, and Tableau for data visualization, data modelling, and DAX queries, with knowledge of best practices.
- Experience in implementing Row-Level Security in Power BI.
- Ability to work with moderately complex data models and quickly understand application data design and processes.
- Familiar with industry best practices for Power BI and experienced in performance optimization of existing implementations.
- Understanding of machine learning algorithms, including supervised, unsupervised, and deep learning techniques.
Data Handling and Processing:
- Proficient in SQL Server and query optimization.
- Expertise in application data design and process management.
- Extensive knowledge of data modelling.
- Hands-on experience with Azure Data Factory and Azure Databricks.
- Expertise in data warehouse development, including experience with SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services).
- Proficiency in ETL processes (data extraction, transformation, and loading), including data cleaning and normalization.
- Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) for large-scale data processing.
- Understanding of data governance, compliance, and security measures within Azure environments.

Job Title : Python Django Developer
Experience : 3+ Years
Location : Gurgaon (Work from Office)
Job Summary :
We are looking for an experienced Python Django Developer with strong expertise in building scalable web applications and distributed systems. The ideal candidate must have hands-on experience with Django, Redis, Celery, RabbitMQ, PostgreSQL, and Kafka to design and optimize high-performance applications.
Mandatory Skills :
Python, Django, Redis, Celery, RabbitMQ, PostgreSQL, Kafka
Key Responsibilities :
- Design, develop, and maintain web applications using Python & Django.
- Implement asynchronous tasks and background job processing using Celery with RabbitMQ/Redis (a minimal sketch follows this list).
- Work with PostgreSQL for database design, optimization, and complex queries.
- Integrate and optimize messaging/streaming systems using Kafka.
- Write clean, scalable, and efficient code following best practices.
- Troubleshoot, debug, and optimize application performance.
- Collaborate with cross-functional teams (frontend, DevOps, QA) for end-to-end delivery.
- Stay updated with the latest backend development trends and technologies.
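For illustration only (not part of the posting): a minimal sketch of the Celery + RabbitMQ/Redis pattern described above. The broker/backend URLs and the send_welcome_email task are invented placeholders.

```python
# tasks.py -- illustrative Celery setup; the broker/backend URLs and the
# send_welcome_email task are hypothetical placeholders.
from celery import Celery

app = Celery(
    "worker",
    broker="amqp://guest:guest@localhost:5672//",   # RabbitMQ as the broker
    backend="redis://localhost:6379/0",             # Redis stores task results
)

@app.task(bind=True, max_retries=3)
def send_welcome_email(self, user_id: int) -> str:
    """Background job: runs on a worker, not in the request cycle."""
    try:
        # ... look up the user and send the email ...
        return f"sent email to user {user_id}"
    except Exception as exc:
        # Retry with exponential backoff instead of failing the request.
        raise self.retry(exc=exc, countdown=2 ** self.request.retries)

# From a Django view, enqueue without blocking the request:
# send_welcome_email.delay(user.id)
```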
Requirements :
- 3+ years of experience in backend development using Python & Django.
- Hands-on experience with Redis, Celery, RabbitMQ, Kafka, and PostgreSQL.
- Strong understanding of REST APIs, microservices architecture, and asynchronous task management.
- Knowledge of performance tuning, caching strategies, and scalable system design.
- Familiarity with Git, CI/CD pipelines, and cloud deployment (AWS/GCP/Azure) is a plus.
- Excellent problem-solving and communication skills.

Senior Software Engineer
Challenge convention and work on cutting edge technology that is transforming the way our customers manage their physical, virtual and cloud computing environments. Virtual Instruments seeks highly talented people to join our growing team, where your contributions will impact the development and delivery of our product roadmap. Our award-winning Virtana Platform provides the only real-time, system-wide, enterprise scale solution for providing visibility into performance, health and utilization metrics, translating into improved performance and availability while lowering the total cost of the infrastructure supporting mission-critical applications.
We are seeking an individual with expert knowledge in Systems Management and/or Systems Monitoring Software, Observability platforms and/or Performance Management Software and Solutions with insight into integrated infrastructure platforms like Cisco UCS, infrastructure providers like Nutanix, VMware, EMC & NetApp and public cloud platforms like Google Cloud and AWS to expand the depth and breadth of Virtana Products.
Role Responsibilities:
- The engineer will be primarily responsible for architecture, design and development of software solutions for the Virtana Platform
- Partner and work closely with cross functional teams and with other engineers and product managers to architect, design and implement new features and solutions for the Virtana Platform.
- Communicate effectively across the departments and R&D organization having differing levels of technical knowledge.
- Work closely with UX Design, Quality Assurance, DevOps and Documentation teams. Assist with functional and system test design and deployment automation
- Provide customers with complex and end-to-end application support, problem diagnosis and problem resolution
- Learn new technologies quickly and leverage 3rd party libraries and tools as necessary to expedite delivery
Required Qualifications:
- 4-10 years of progressive experience with back-end development in a client-server application development environment, focused on Systems Management, Systems Monitoring, and Performance Management Software.
- Deep experience in public cloud environments using Kubernetes and other distributed managed services such as Kafka (Google Cloud and/or AWS)
- Experience with CI/CD and cloud-based software development and delivery
- Deep experience with integrated infrastructure platforms and experience working with one or more data collection technologies such as SNMP, REST, OTEL, WMI, or WBEM.
- Minimum of 6 years of development experience with one or more high-level languages such as Go, Python, or Java; deep experience with at least one of these languages is required.
- Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or equivalent
- Highly effective verbal and written communication skills and ability to lead and participate in multiple projects
- Well versed with identifying opportunities and risks in a fast-paced environment and ability to adjust to changing business priorities
- Must be results-focused, team-oriented and with a strong work ethic
Desired Qualifications:
- Prior experience with other virtualization platforms like OpenShift is a plus
- Prior experience as a contributor to engineering and integration efforts with strong attention to detail and exposure to Open-Source software is a plus
- Demonstrated ability as a lead engineer who can architect, design and code with strong communication and teaming skills
- Deep development experience with the development of Systems, Network and performance Management Software and/or Solutions is a plus
About Virtana:
Virtana delivers the industry’s broadest and deepest Observability Platform, allowing organizations to monitor infrastructure, de-risk cloud migrations, and reduce cloud costs by 25% or more.
Over 200 Global 2000 enterprise customers, such as AstraZeneca, Dell, Salesforce, Geico, Costco, Nasdaq, and Boeing, have valued Virtana’s software solutions for over a decade.
Our modular platform for hybrid IT digital operations includes Infrastructure Performance Monitoring and Management (IPM), Artificial Intelligence for IT Operations (AIOps), Cloud Cost Management (FinOps), and Workload Placement Readiness Solutions. Virtana is simplifying the complexity of hybrid IT environments with a single cloud-agnostic platform across all the categories listed above. The $30B IT Operations Management (ITOM) Software market is ripe for disruption, and Virtana is uniquely positioned for success.

Job Title: Python Developer - Django (Full Time)
Location: Gurgaon, Onsite
Interview: Virtual Interview
Experience Required: 3+ Years
About the Role
We are looking for a skilled Python Developer with hands-on experience in building scalable backend systems. The ideal candidate should have strong expertise in Python, Django, distributed task queues using Celery, Redis, RabbitMQ, and experience working with event streaming platforms like Kafka.
Key Responsibilities
- Design, develop, and maintain backend services using Python and Django.
- Implement and optimize task queues using Celery with Redis/RabbitMQ as brokers.
- Develop and integrate event-driven systems using Apache Kafka (a minimal sketch follows this list).
- Write clean, reusable, and efficient code following best practices.
- Build RESTful APIs and integrate with external services.
- Ensure performance, scalability, and security of applications.
- Collaborate with frontend developers, DevOps, and product teams to deliver high-quality solutions.
- Troubleshoot and debug issues in production and staging environments.
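For illustration only: a hedged sketch of the Kafka event flow described above, using the kafka-python client. The order-events topic, broker address, and consumer group are invented placeholders.

```python
# Illustrative event publishing with kafka-python; the "order-events"
# topic and localhost broker are placeholder assumptions.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",  # wait for full replication before confirming a send
)
producer.send("order-events", {"order_id": 42, "status": "created"})
producer.flush()

# A consumer in another process picks events up, grouped for scaling:
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    group_id="billing-service",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
for event in consumer:
    print(event.value)  # e.g. {'order_id': 42, 'status': 'created'}
```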
Required Skills & Experience
- 2+ years of professional experience in Python backend development.
- Strong knowledge of Django Framework.
- Hands-on experience with Celery, Redis, RabbitMQ, and Kafka.
- Good understanding of REST API design principles.
- Experience with relational databases (PostgreSQL/MySQL).
- Familiarity with version control (Git) and Agile development.
- Strong problem-solving skills and ability to work in a fast-paced environment.


Role: .NET + React Developer
Experience: 6+ Years
Location: Andheri (Navi Mumbai)
Budget: 18 LPA
Opportunity: Contract
Technical Expertise:
* Proficiency in OOP concepts, C#, .NET Core, Entity Framework, React, SQL Server, PostgreSQL, Dapper, ADO.NET, LINQ, and Web API Development.
* Experience with Kafka or RabbitMQ for event-driven architecture and messaging systems.
* Debugging and troubleshooting skills with an understanding of performance optimization.
* Strong knowledge of database development, including tables, views, stored procedures, triggers, and functions.
* Familiarity with unit testing frameworks such as XUnit.
* Experience with JWT services, Git, and third-party API integration.
* Experience in reviewing junior developers’ code.
Role Overview
We are seeking a motivated and technically versatile CMS Engineer (IC2) to support our transition from SharePoint to Contentful, while also contributing to the broader CMS ecosystem. This is an excellent opportunity for an early-career engineer to work on enterprise-grade platforms and microservice-based architectures.
Key Qualifications
· 3-5 years of experience with SharePoint Online and/or enterprise CMS platforms.
· Familiarity with Contentful or other headless CMS solutions is a strong plus.
· Hands-on experience with Java, Spring Boot, and relational databases (e.g., PostgreSQL).
· Exposure to Kafka, Elasticsearch, or similar distributed technologies is desirable.
· Solid problem-solving and communication skills with an eagerness to learn.
What would you do here
SharePoint to Contentful Migration | Backend + CMS Integration
· Assist in maintaining and enhancing SharePoint Online content and features during the transition period.
· Support the migration of pages, documents, and metadata from SharePoint to Contentful.
· Contribute to the design and development of backend services that integrate with Contentful using Java, Spring Boot, and REST APIs.
· Write reusable services for content delivery, search indexing (via Elasticsearch), and event processing (via Kafka).
· Help develop APIs for CMS-based applications that interact with PostgreSQL databases.
· Troubleshoot CMS-related issues and support testing efforts during platform migration.
To be successful in this role, you should possess
• Collaborate closely with Product Management and Engineering leadership to devise and build the right solution.
• Participate in design discussions and brainstorming sessions to select, integrate, and maintain Big Data tools and frameworks required to solve Big Data problems at scale.
• Design and implement systems to cleanse, process, and analyze large data sets using distributed processing tools like Akka and Spark.
• Understand and critically review existing data pipelines, and come up with ideas, in collaboration with Technical Leaders and Architects, to improve upon current bottlenecks.
• Take initiative, proactively pick up new technologies, and work as a senior individual contributor on the multiple products and features we have.
• 3+ years of experience in developing highly scalable Big Data pipelines.
• In-depth understanding of the Big Data ecosystem, including processing frameworks like Spark, Akka, Storm, and Hadoop, and the file types they deal with.
• Experience with ETL and data pipeline tools like Apache NiFi, Airflow, etc.
• Excellent coding skills in Java or Scala, including the understanding to apply appropriate design patterns when required.
• Experience with Git and build tools like Gradle/Maven/SBT.
• Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
• An elegant, readable, maintainable, and extensible code style.
You are someone who would easily be able to
• Work closely with the US and India engineering teams to help build the Java/Scala-based data pipelines
• Lead the India engineering team in technical excellence and ownership of critical modules; own the development of new modules and features
• Troubleshoot live production server issues.
• Handle client coordination, work as part of a team, contribute independently, and drive the team to exceptional contributions with minimal supervision
• Follow Agile methodology, using JIRA for work planning and issue management/tracking
Additional Project/Soft Skills:
• Should be able to work independently with India & US based team members.
• Strong verbal and written communication with ability to articulate problems and solutions over phone and emails.
• Strong sense of urgency, with a passion for accuracy and timeliness.
• Ability to work calmly in high pressure situations and manage multiple projects/tasks.
• Ability to work independently and possess superior skills in issue resolution.
• Should have a passion for learning and implementing, and for analyzing and troubleshooting issues
The candidate should have extensive experience in designing and developing scalable data pipelines and real-time data processing solutions. As a key member of the team, the Senior Data Engineer will play a critical role in building end-to-end data workflows, supporting machine learning model deployment, and driving MLOps practices in a fast-paced, agile environment. Strong expertise in Apache Kafka, Apache Flink, AWS SageMaker, and Terraform is essential. Additional experience with infrastructure automation and CI/CD for ML models is a significant advantage.
Key Responsibilities
- Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink (a minimal streaming sketch follows the lists below).
- Build scalable and automated MLOps pipelines for training, validation, and deployment of models using AWS SageMaker and associated services.
- Implement and manage Infrastructure as Code (IaC) using Terraform to provision and manage AWS environments.
- Collaborate with data scientists, ML engineers, and DevOps teams to streamline model deployment workflows and ensure reliable production delivery.
- Optimize data storage and retrieval strategies for large-scale structured and unstructured datasets.
- Develop data transformation logic and integrate data from various internal and external sources into data lakes and warehouses.
- Monitor, troubleshoot, and enhance performance of data systems in a cloud-native, fast-evolving production setup.
- Ensure adherence to data governance, privacy, and security standards across all data handling activities.
- Document data engineering solutions and workflows to facilitate cross-functional understanding and ongoing maintenance.
- Strong proficiency in the Java programming language.
- Experience with Java frameworks like Spring and Spring Boot.
- Understanding of RESTful APIs and web services.
- Experience with databases and data storage technologies (e.g., SQL, NoSQL).
- Knowledge of software development best practices, including testing and code quality.
- Experience with version control systems (e.g., Git).
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP).
- Strong problem-solving and debugging skills.
- Excellent communication and collaboration skills.
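For illustration only: the real-time pipeline work described above often reduces to a consume-transform-produce loop. This hedged confluent-kafka sketch shows that shape; the topic names, broker address, and enrich() step are invented, and the Flink/SageMaker pieces are beyond a short sketch.

```python
# Minimal consume-transform-produce loop with confluent-kafka.
# Topic names, broker address, and the enrich() logic are assumptions.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "etl-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["raw-events"])

def enrich(record: dict) -> dict:
    record["processed"] = True  # placeholder transformation
    return record

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        producer.produce("clean-events", json.dumps(enrich(event)).encode())
        producer.poll(0)  # serve delivery callbacks
except KeyboardInterrupt:
    pass
finally:
    producer.flush()
    consumer.close()
```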

The Opportunity
We’re looking for a Senior Data Engineer to join our growing Data Platform team. This role is a hybrid of data engineering and business intelligence, ideal for someone who enjoys solving complex data challenges while also building intuitive and actionable reporting solutions.
You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, dashboards, machine learning, and decision-making across Sonatype. You’ll also be responsible for delivering clear, compelling, and insightful business intelligence through tools like Looker Studio and advanced SQL queries.
What You’ll Do
- Design, build, and maintain scalable data pipelines and ETL/ELT processes (an orchestration sketch follows this list).
- Architect and optimize data models and storage solutions for analytics and operational use.
- Create and manage business intelligence reports and dashboards using tools like Looker Studio, Power BI, or similar.
- Collaborate with data scientists, analysts, and stakeholders to ensure datasets are reliable, meaningful, and actionable.
- Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake).
- Write complex, high-performance SQL queries to support reporting and analytics needs.
- Implement observability, alerting, and data quality monitoring for critical pipelines.
- Drive best practices in data engineering and business intelligence, including documentation, testing, and CI/CD.
- Contribute to the evolution of our next-generation data lakehouse and BI architecture.
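For illustration only: a compact sketch of an orchestrated ELT pipeline of the kind described above, using Airflow's TaskFlow API (the schedule argument assumes Airflow 2.4+). The DAG name and task bodies are invented placeholders.

```python
# Illustrative Airflow TaskFlow DAG; the extract/transform/load bodies
# and the "daily_elt" name are placeholder assumptions.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_elt():
    @task
    def extract() -> list[dict]:
        return [{"user": "a", "events": 3}]  # stand-in for a source query

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [r for r in rows if r["events"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        print(f"loading {len(rows)} rows")  # stand-in for a warehouse write

    load(transform(extract()))

daily_elt()
```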
What We’re Looking For
Minimum Qualifications
- 5+ years of experience as a Data Engineer or in a hybrid data/reporting role.
- Strong programming skills in Python, Java, or Scala.
- Proficiency with data tools such as Databricks, data modeling techniques (e.g., star schema, dimensional modeling), and data warehousing solutions like Snowflake or Redshift.
- Hands-on experience with modern data platforms and orchestration tools (e.g., Spark, Kafka, Airflow).
- Proficient in SQL with experience in writing and optimizing complex queries for BI and analytics.
- Experience with BI tools such as Looker Studio, Power BI, or Tableau.
- Experience in building and maintaining robust ETL/ELT pipelines in production.
- Understanding of data quality, observability, and governance best practices.
Bonus Points
- Experience with dbt, Terraform, or Kubernetes.
- Familiarity with real-time data processing or streaming architectures.
- Understanding of data privacy, compliance, and security best practices in analytics and reporting.
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software.
- Full-spectrum impact: Use both engineering and analytical skills to shape product, strategy, and operations.
- Modern tooling: Leverage the best of open-source and cloud-native technologies.
- Collaborative culture: Join a passionate team that values learning, autonomy, and real-world impact.

About the Role
We’re hiring a Data Engineer to join our Data Platform team. You’ll help build and scale the systems that power analytics, reporting, and data-driven features across the company. This role works with engineers, analysts, and product teams to make sure our data is accurate, available, and usable.
What You’ll Do
- Build and maintain reliable data pipelines and ETL/ELT workflows.
- Develop and optimize data models for analytics and internal tools.
- Work with team members to deliver clean, trusted datasets.
- Support core data platform tools like Airflow, dbt, Spark, Redshift, or Snowflake.
- Monitor data pipelines for quality, performance, and reliability (a small data-quality example follows this list).
- Write clear documentation and contribute to test coverage and CI/CD processes.
- Help shape our data lakehouse architecture and platform roadmap.
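For illustration only: pipeline quality monitoring often starts with simple assertions over a dataset. This pandas sketch uses invented column names and thresholds.

```python
# Toy data-quality gate with pandas; column names and checks are
# illustrative assumptions, not a prescribed standard.
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; empty means healthy."""
    failures = []
    if df["order_id"].isna().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    return failures

df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 7.5]})
for problem in check_quality(df):
    print("DATA QUALITY FAILURE:", problem)  # wire this to alerting
```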
What You Need
- 2–4 years of experience in data engineering or a backend data-related role.
- Strong skills in Python or another backend programming language.
- Experience working with SQL and distributed data systems (e.g., Spark, Kafka).
- Familiarity with NoSQL stores like HBase or similar.
- Comfortable writing efficient queries and building data workflows.
- Understanding of data modeling for analytics and reporting.
- Exposure to tools like Airflow or other workflow schedulers.
Bonus Points
- Experience with DBT, Databricks, or real-time data pipelines.
- Familiarity with cloud infrastructure tools like Terraform or Kubernetes.
- Interest in data governance, ML pipelines, or compliance standards.
Why Join Us?
- Work on data that supports meaningful software security outcomes.
- Use modern tools in a cloud-first, open-source-friendly environment.
- Join a team that values clarity, learning, and autonomy.
If you're excited about building impactful software and helping others do the same, this is an opportunity to grow as a technical leader and make a meaningful impact.

🚀 Hiring: MERN Stack at Deqode
⭐ Experience: 2+ Years
📍 Location: Mumbai
⭐ Work Mode: 5 Days Work From Office
⏱️ Notice Period: Immediate Joiners or 15 Days
(Only immediate joiners & candidates serving notice period)
MERN Stack (2+ Years of Experience) - Mumbai
🔹 Experience: 2 to 4 Years
🔹Skills: MongoDB, Express, React, Node, Docker, Kubernetes, Kafka

IT Services-Based Company
Looking for a Java Developer for Gurugram and Bangalore locations, with 5+ years of experience in Java + Microservices, Multithreading, Spring Boot, Kafka, and any MQ Series
Job Title: Backend Engineer - NodeJS, NestJS, and Python
Location: Hybrid, 2-3 days per week WFO (Bengaluru, India)
About the role:
We are looking for a skilled and passionate Senior Backend Developer to join our dynamic team. The ideal candidate should have strong experience in Node.js and NestJS, along with a solid understanding of database management, query optimization, and microservices architecture. As a backend developer, you will be responsible for developing and maintaining scalable backend systems, building robust APIs, integrating databases, and working closely with frontend and DevOps teams to deliver high-quality software solutions.
What You'll Do 🛠️
- Design, develop, and maintain server-side logic using Node.js, NestJS, and Python.
- Develop and integrate RESTful APIs and microservices to support scalable systems.
- Work with NoSQL and SQL databases (e.g., MongoDB, PostgreSQL, MySQL) to create and manage schemas, write complex queries, and optimize performance.
- Collaborate with cross-functional teams including frontend, DevOps, and QA.
- Ensure code quality, maintainability, and scalability through code reviews, testing, and documentation.
- Monitor and troubleshoot production systems, ensuring high availability and performance.
- Implement security and data protection best practices.
What You'll Bring 💼
- 4 to 6 years of professional experience as a backend developer.
- Strong proficiency in Node.js and NestJS framework.
- Good hands-on experience with Python (Django/Flask experience is a plus).
- Solid understanding of relational and non-relational databases.
- Proficient in writing complex NoSQL and SQL queries
- Experience with microservices architecture and distributed systems.
- Familiarity with version control systems like Git.
- Basic understanding of containerization (e.g., Docker) and cloud services is a plus.
- Excellent problem-solving skills and a collaborative mindset.
Bonus Points ➕
- Experience with CI/CD pipelines.
- Exposure to cloud platforms like AWS, GCP or Azure.
- Familiarity with event-driven architecture or message brokers (MQTT, Kafka, RabbitMQ)
Why this role matters
You will help build the company from the ground up—shaping our culture and having an impact from Day 1 as part of the foundational team.

Java Developer – Job Description
Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark in high-end technical consulting.
Required Skills:
• Exp. - 4 to 7 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs, and with rich work experience in some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

Role Overview:
We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.
The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
Key Responsibilities:
- Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
- Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data (a streaming sketch follows this list).
- Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
- Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
- Mentor junior engineers, perform code reviews, and promote engineering best practices.
- Stay current with evolving technologies in cloud, big data, and healthcare data standards.
- Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).
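For illustration only: a hedged sketch of the Kafka-to-Delta streaming pattern such a role typically involves on Databricks, using Spark Structured Streaming. The topic, schema, and paths are invented placeholders.

```python
# Illustrative Spark Structured Streaming job reading from Kafka and
# writing Delta; topic name, checkpoint path, and schema are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("hl7-ingest").getOrCreate()

schema = StructType([
    StructField("patient_id", StringType()),
    StructField("event_type", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clinical-events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Delta output assumes a Databricks (or delta-enabled) runtime.
(events.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/clinical-events")
    .start("/delta/clinical_events"))
```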
Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
- Proficient in Python for data processing and automation.
- Experience with Azure Databricks (or readiness to ramp up quickly).
- Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
- Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
- Familiarity with containerization tools like Docker and orchestration using Kubernetes.
- Exposure to CI/CD pipelines for data applications.
- Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
- Excellent problem-solving abilities and a proactive mindset.
- Strong communication and interpersonal skills to work in cross-functional teams.


Employment Type: Contract basis
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks (a minimal sketch follows this list).
- Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
- Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
- Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
- Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
- Maintain documentation and implement best practices for data architecture, governance, and security.
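For illustration only: a minimal PySpark batch ETL sketch matching the first responsibility above. Paths and column names are invented placeholders.

```python
# Minimal PySpark batch ETL; input/output paths and column names are
# placeholder assumptions for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.option("header", True).csv("s3://raw/orders/")

daily = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)  # basic cleansing: drop bad rows
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("customer_id").alias("customers"))
)

# Partitioned parquet keeps downstream warehouse loads incremental.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated/daily_revenue/"
)
```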
⚙️ Required Skills
- Programming: Proficient in PySpark, Python, SQL, and MongoDB
- Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
- Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
- Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
- Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
- CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
🧰 Preferred Qualifications
- Bachelor's or Master's in Computer Science, Engineering, or related field.
- Certifications in Azure/AWS are highly desirable.
- Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.
Job Description:
We are looking for a Senior Java Developer with strong expertise in Apache Kafka and backend systems. The ideal candidate will have hands-on experience in Java (8/11+), Spring Boot, and building scalable, real-time data pipelines using Kafka.
Key Responsibilities:
- Develop and maintain backend services using Java and Spring Boot
- Design and implement Kafka-based messaging and streaming solutions
- Optimize Kafka performance (topics, partitions, consumers); a broker-side sketch follows this list
- Collaborate with cross-functional teams to deliver scalable microservices
- Ensure code quality and maintain best practices in a distributed environment
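For illustration only: this posting is Java-centric, but topic and partition sizing is broker-side configuration, so a hedged Python sketch (kafka-python admin client, with invented names and values) still shows the trade-off.

```python
# Broker-side topic sizing with kafka-python's admin client; topic name,
# partition count, and retention value are illustrative assumptions.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

# More partitions allow more parallel consumers in a group, at the cost
# of more broker overhead and longer leader elections; size deliberately.
topic = NewTopic(
    name="payments",
    num_partitions=12,
    replication_factor=3,
    topic_configs={"retention.ms": str(7 * 24 * 3600 * 1000)},  # 7 days
)
admin.create_topics([topic])
admin.close()
```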
Required Skills:
- 6+ years in Java development
- 3+ years of hands-on Kafka experience (producers, consumers, streams)
- Strong knowledge of Spring Boot, REST APIs, and microservices
- Familiarity with Kafka Connect, Schema Registry, and stream processing
- Experience with containerization (Docker), CI/CD, and cloud platforms (AWS/GCP/Azure)

Senior Data Engineer Job Description
Overview
The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.
Key Responsibilities
- Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
- Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
- Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
- Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
- Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
- Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
- Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
- Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
- Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.
Required Qualifications
- Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
- Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
- Technical Skills:
  - Proficiency in programming languages such as Python, Java, or Scala.
  - Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
  - Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
  - Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
  - Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
  - Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
- Soft Skills:
  - Excellent problem-solving and analytical skills.
  - Strong communication and collaboration abilities.
  - Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
- Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.
Preferred Qualifications
- Experience with real-time data processing and streaming architectures.
- Familiarity with machine learning pipelines and MLOps practices.
- Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
- Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.
Work Environment
- Location: Hybrid/Remote/On-site (depending on company policy).
- Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
- Hours: Full-time, with occasional on-call responsibilities for critical data systems.
Hiring for Java Developer
Experience : 5 to 10 yrs
Notice Period : 0 to 15 days
Location : Pune
Work Mode : WFO (5 days)
As a Java developer, you would be expected to perform many duties throughout the development lifecycle of applications, from concept and design right through to testing. Here are some of the responsibilities you may have:
Develop high-level design and define software architecture
Implement and maintain quality systems within the group
Proficiently estimate and design approaches, nimbly move to alternate approaches if needed, and develop and execute unit test strategies
Monitor and track tasks, and report status
Assist project heads to conceptualize, design, develop, test and implement technology solutions
Effectively collaborate with stakeholders and users to ensure customer satisfaction
Skill Set :
Java 7/Java 8 with Microservices, Multithreading, Spring Boot, JUnit, Kafka, Splunk (good to have), OpenShift (good to have), Authentication/Spring Security (good to have)
Job Title : Red Hat PAM Developer
Experience Required :
- Relevant Experience : 4+ Years
- Total Experience : Up to 8 Years
No. of Positions : 4
Work Locations : Hyderabad / Bangalore / Mumbai / Pune / Gurgaon / Chennai
Work Mode : Hybrid
Work Timings : 1:00 PM to 10:00 PM IST
Interview Mode : Virtual
Interview Rounds : 2
Mandatory Skills :
- Excellent communication skills – must be comfortable in client-facing roles
- Red Hat Process Automation Manager (PAM)
- JBPM (Java Business Process Management)
- BPMN 2.0 (Business Process Model and Notation) – low-code platform
- DMN (Decision Model and Notation) – business processes and rules
- Spring Boot
- JavaScript
Good-to-Have Skills :
- Red Hat Fuse
- Apache Kafka
- Apache Camel
Java Developer – Job Description
Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark in high-end technical consulting.
Required Skills:
• Exp. - 5 to 12 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs, and with rich work experience in some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

About the Role:
We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Able to lead and manage a team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.

Job Title : Python Django Developer
Experience : 3+ Years
Location : Gurgaon
Working Days : 6 Days (Monday to Saturday)
Job Summary :
We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.
The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.
Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.
Key Responsibilities :
- Write efficient, reusable, testable, and scalable code using the Django framework (a small sketch follows this list).
- Develop backend components, server-side logic, and statistical models.
- Design and implement high-availability, low-latency applications with robust data protection and security.
- Contribute to the development of highly responsive web applications.
- Collaborate with cross-functional teams on system design and integration.
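For illustration only: a small sketch of the reusable, query-efficient Django code the responsibilities describe. The Order model and view are invented and assume a configured Django project.

```python
# Illustrative Django model + view; the Order model, fields, and URL
# wiring are invented for this sketch and assume a configured project.
from django.db import models
from django.db.models import Count, Sum
from django.http import JsonResponse

class Order(models.Model):
    customer = models.ForeignKey("auth.User", on_delete=models.CASCADE)
    amount = models.DecimalField(max_digits=10, decimal_places=2)
    created_at = models.DateTimeField(auto_now_add=True)

def customer_summary(request):
    """One aggregated query instead of N+1 per-customer lookups."""
    rows = (
        Order.objects
        .values("customer__username")
        .annotate(orders=Count("id"), total=Sum("amount"))
        .order_by("-total")[:20]
    )
    return JsonResponse({"top_customers": list(rows)})
```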
Mandatory Skills :
- Strong programming skills in Python and Django (or similar frameworks like Flask).
- Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
- Strong understanding of SQL and NoSQL ORM.
- Solid grasp of data structures, multithreading, and operating system concepts.
- Experience with RESTful API development and implementation of API security.
- Knowledge of JSON/XML and their use in data exchange.
Good-to-Have Skills :
- Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
- Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
- Familiarity with MongoDB and other NoSQL databases
- Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
- Knowledge in building and integrating statistical learning models.
Job Role: We are seeking a skilled Java Developer to contribute to the development and enhancement of a renowned banking application that supports automatic reconciliation and unified data reporting for its clients. This role involves working on high-impact enhancements, data pipeline integration, and platform modernization. The ideal candidate will be a quick learner, self-motivated, and able to ramp up quickly in a fast-paced environment.
Key Responsibilities:
Design, develop, and maintain Java-based applications using Java 17 and Spring Boot.
Implement and manage message routing using Apache Camel.
Develop and monitor data pipelines using Kafka.
Support and enhance existing cloud-native applications.
Work with OpenShift Container Platform (OCP 4) for container orchestration and deployments.
Utilize Jenkins for CI/CD pipeline automation and management.
Collaborate with cross-functional teams to integrate multiple data sources into a unified reporting platform.
Participate in code reviews, unit testing, and performance tuning.
Troubleshoot and resolve production issues in collaboration with operations teams.
Document development processes and system configurations.
Required Skills:
Strong proficiency in Java 17 and Spring Boot frameworks.
Hands-on experience with Apache Camel for message routing and transformation.
Solid experience in Kafka development and monitoring tools.
Good understanding of cloud pipeline architectures and deployment strategies.
Experience working with OpenShift (OCP 4).
Familiarity with Jenkins for CI/CD and automated deployments.
Understanding of cloud deployment platforms (AWS, Azure, or GCP preferred).
Strong analytical and debugging skills.
Ability to learn quickly and adapt to evolving project requirements.
Nice to Have:
Experience in financial services or transaction reporting platforms.
Familiarity with microservices architecture and containerization best practices.
Knowledge of monitoring tools (e.g., Prometheus, Grafana).

Company Overview
We are a dynamic startup dedicated to empowering small businesses through innovative technology solutions. Our mission is to level the playing field for small businesses by providing them with powerful tools to compete effectively in the digital marketplace. Join us as we revolutionize the way small businesses operate online, bringing innovation and growth to local communities.
Job Description
We are seeking a skilled and experienced Data Engineer to join our team. In this role, you will develop systems on cloud platforms capable of processing millions of interactions daily, leveraging the latest cloud computing and machine learning technologies while creating custom in-house data solutions. The ideal candidate should have hands-on experience with SQL, PL/SQL, and any standard ETL tools. You must be able to thrive in a fast-paced environment and possess a strong passion for coding and problem-solving.
Required Skills and Experience
- Minimum 5 years of experience in software development.
- 3+ years of experience in data management and SQL expertise – PL/SQL, Teradata, and Snowflake experience strongly preferred.
- Expertise in big data technologies such as Hadoop, HiveQL, and Spark (Scala/Python).
- Expertise in cloud technologies – AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR).
- Experience with queuing systems (e.g., SQS, Kafka) and caching systems (e.g., Ehcache, Memcached); a short SQS sketch follows this list.
- Experience with container management tools (e.g., Docker Swarm, Kubernetes).
- Familiarity with data stores, including at least one of the following: Postgres, MongoDB, Cassandra, or Redis.
- Ability to create advanced visualizations and dashboards to communicate complex findings (e.g., Looker Studio, Power BI, Tableau).
- Strong skills in manipulating and transforming complex datasets for in-depth analysis.
- Technical proficiency in writing code in Python and advanced SQL queries.
- Knowledge of AI/ML infrastructure, best practices, and tools is a plus.
- Experience in analyzing and resolving code issues.
- Hands-on experience with software architecture concepts such as Separation of Concerns (SoC) and micro frontends with theme packages.
- Proficiency with the Git version control system.
- Experience with Agile development methodologies.
- Strong problem-solving skills and the ability to learn quickly.
- Exposure to Docker and Kubernetes.
- Familiarity with AWS or other cloud platforms.
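For illustration only: the queuing-systems requirement above might look like this in practice with SQS via boto3. The queue URL is a placeholder, and credentials are assumed to come from the environment.

```python
# Minimal SQS send/receive with boto3; the queue URL is a placeholder
# and AWS credentials are assumed to come from the environment.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"

sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps({"product_id": 7, "action": "reindex"}),
)

resp = sqs.receive_message(
    QueueUrl=QUEUE_URL,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=20,  # long polling cuts down empty receives
)
for msg in resp.get("Messages", []):
    print(json.loads(msg["Body"]))
    # Delete only after successful processing (at-least-once semantics).
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```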
Responsibilities
- Develop and maintain our in-house search and reporting platform
- Create data solutions to complement core products to improve performance and data quality
- Collaborate with the development team to design, develop, and maintain our suite of products.
- Write clean, efficient, and maintainable code, adhering to coding standards and best practices.
- Participate in code reviews and testing to ensure high-quality code.
- Troubleshoot and debug application issues as needed.
- Stay up-to-date with emerging trends and technologies in the development community.
How to apply?
- If you are passionate about designing user-centric products and want to be part of a forward-thinking company, we would love to hear from you. Please send your resume, a brief cover letter outlining your experience, and your current CTC (Cost to Company) as part of the application.
Join us in shaping the future of e-commerce!