
50+ Apache Kafka Jobs in India

Apply to 50+ Apache Kafka Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache Kafka Jobs and apply today!

A global MNC in the data management industry.

Agency job
via Techno Wise by Happy D
Hyderabad
9 - 12 yrs
₹20L - ₹32L / yr
Java
Amazon Web Services (AWS)
Apache Kafka
Grafana
Spring

Technical Lead – DevOps Engineer


Job Summary:


We are seeking a seasoned Technical Lead with 9–10 years of hands-on experience in designing and delivering scalable enterprise solutions. The ideal candidate will be an expert in container technologies, real-time data streaming, DevOps practices, and performance optimization. You will play a key leadership role in architecting robust systems, guiding teams, and driving continuous improvement in both development and operational processes.


Key Responsibilities:


Design scalable, enterprise-grade solutions using OpenShift, Kafka, and container-based architectures.

Lead container migrations from legacy systems to modern, cloud-native infrastructure.

Ensure system performance, high availability, and security across enterprise platforms.

Take ownership of end-to-end architecture and DevOps integration, including CI/CD pipeline automation.

Drive compliance, governance, and best practices in deployment and release processes.

Mentor development teams on performance tuning, observability, and sustainable coding strategies.

Collaborate with product managers, QA, and infrastructure teams to ensure alignment and delivery excellence.

Evaluate and implement tools for monitoring, alerting, and logging to improve operational visibility and reliability.

Conduct technical reviews and performance audits to continuously enhance platform stability and speed.


Must-Have Skills:


Proven experience designing scalable and secure solutions using OpenShift.

Hands-on expertise in Apache Kafka for real-time data streaming and event-driven architectures.

Strong background in containerization and container orchestration (Docker, Kubernetes, OpenShift).

Experience leading container migrations and optimizing system performance, security, and availability.

Solid knowledge of CI/CD automation, with tools such as Jenkins and GitLab CI/CD.

Expertise in DevOps practices, including infrastructure as code (IaC), continuous monitoring, and automated deployments.

Strong mentoring skills and experience coaching teams on observability, tuning, and architectural best practices.
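
The event-driven streaming these requirements center on is, at its core, a producer/consumer pattern. A minimal, broker-free sketch (an in-memory standard-library queue stands in for a Kafka topic; all names and event shapes are illustrative):

```python
import queue
import threading

# An in-memory FIFO queue stands in for a Kafka topic partition.
topic = queue.Queue()
SENTINEL = object()  # signals end of stream
results = []

def producer():
    # In a real deployment this would be a producer client sending to a broker.
    for event_id in range(5):
        topic.put({"id": event_id, "type": "order.created"})
    topic.put(SENTINEL)

def consumer():
    # The consumer polls the topic and processes events in arrival order.
    while True:
        event = topic.get()
        if event is SENTINEL:
            break
        results.append(event["id"])

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 1, 2, 3, 4]
```

With a single consumer on a FIFO queue, ordering is preserved end to end, which mirrors Kafka's per-partition ordering guarantee.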


Good-to-Have Skills:


Experience with cloud platforms (AWS).

Knowledge of microservices and event-driven design patterns.

Hands-on experience with monitoring tools such as Grafana and the ELK Stack.

Experience implementing security best practices.

Working knowledge of Agile/Scrum methodologies.

Certification in Kubernetes, OpenShift, or cloud platforms is a plus.

MARS Telecom Systems

Posted by Nikita Sinha
Hyderabad
4 - 7 yrs
Up to ₹18L / yr (varies)
Java
RESTful APIs
Spring Boot
Microservices
Apache Kafka

Performs complex software engineering assignments following designated standards and procedures. Provides technical guidance and leadership, and mentors more junior members of the team. Conceptualizes, designs, codes, debugs and performs development activities in accordance with designated standards and procedures. Works closely with other engineering disciplines. This position typically works under general supervision and direction. Incumbents of this position will regularly exercise discretionary and substantial decision-making authority.


Essential Job Duties and Responsibilities: 

  • Coordinates and develops project concepts, objectives, specifications and resource needs. Prepares design specifications, analyses and recommendations.
  • Use current programming languages and technologies to provide creative, thorough and practical solutions to a wide range of technical problems
  • Design, develop, and test applications and programs to support the company’s products
  • Design, develop, and test software programs following established quality standards and in accordance with internal engineering procedures, including coding, unit testing and software configuration control
  • Complete high- and low-level detailed software design specifications, storyboards and interface specifications
  • Provide support of products from conception to product delivery, including problem solving, defect maintenance and support to customer services
  • Prepare reports, manuals, procedures and status reports
  • Participate and work with team members in code reviews and make necessary improvements in code
  • Coaches and mentors junior team members
  • Keeps abreast of improvements and developments within software engineering
  • Supports bid and proposals and customer variation requests
  • Supports and coaches more junior members of the team

Qualifications

  • Four-year college degree in computer science, computer engineering or other related technical discipline.


Skills/Experience/Knowledge:

  • 3 to 5 years related experience.
  • Experience with, and understanding of, the software development life-cycle
  • Experience debugging and troubleshooting
  • Experience working within Agile/Scrum methodologies
  • Strong in Java SE and Multi-Threaded programming
  • Must have experience exposing web services using JAX-WS/REST (one to two years within the last five years of experience)
  • Hands-on experience in Spring and Hibernate (one to two years within the last five years of experience)
  • Hands-on experience in Angular or ReactJS
  • Good in SQL
  • AWS knowledge is mandatory; candidates with AWS certification will be given strong preference.
  • Experience in high- and low-level design
  • Experience on any Enterprise Service Bus (ESB)
  • Experience on Spring Boot and Spring Data
  • Experience on UI development using JS libraries like Angular
  • Prior experience in product development

Personal Qualities:

  • Effective written and oral communication skills
  • Excellent problem-solving skills
  • Team player
  • Able to prioritize work, complete multiple tasks and work under deadline and budget guidelines.
  • May be required to travel domestically and internationally to include working odd hours, in-line with customer requirements

Mandatory Skills

  • Experience with, and understanding of, the software development life-cycle
  • Experience debugging and troubleshooting
  • Experience working within Agile/Scrum methodologies
  • Strong in Java SE and Multi-Threaded programming
  • Must have experience exposing web services using JAX-WS/REST (one to two years within the last five years of experience)
  • Hands-on experience in Spring and Hibernate (one to two years within the last five years of experience)
  • Hands-on experience in Angular or ReactJS
  • Good in SQL
  • AWS knowledge is mandatory; candidates with AWS certification will be given strong preference.
  • Experience on Spring Boot and Spring Data
  • Experience on UI development using JS libraries like Angular
  • Prior experience in product development

Optional/Nice to have skills

  • Experience in high- and low-level design
  • Experience on any Enterprise Service Bus (ESB)
  • Effective written and oral communication skills
  • Excellent problem-solving skills
  • Team player
  • Able to prioritize work, complete multiple tasks and work under deadline and budget guidelines.
Publicis Sapient

Posted by Dipika
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Hyderabad, Pune
5 - 7 yrs
₹5L - ₹20L / yr
Java
Microservices
Apache Kafka
Apache ActiveMQ

Senior Associate Technology L1 – Java Microservices


Company Description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.


Job Description

We are looking for a Senior Associate Technology Level 1 - Java Microservices Developer to join our team of bright thinkers and doers. You’ll use your problem-solving creativity to design, architect, and develop high-end technology solutions that solve our clients’ most complex and challenging problems across different industries.

We are on a mission to transform the world, and you will be instrumental in shaping how we do it with your ideas, thoughts, and solutions.


Your Impact:

• Drive the design, planning, and implementation of multifaceted applications, giving you breadth and depth of knowledge across the entire project lifecycle.

• Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients’ business

• Constantly innovate and evaluate emerging technologies and methods to provide scalable and elegant solutions that help clients achieve their business goals.


Qualifications

➢ 5 to 7 Years of software development experience

➢ Strong development skills in Java JDK 1.8 or above

➢ Java fundamentals like Exception handling, Serialization/Deserialization and Immutability concepts

➢ Good fundamental knowledge of Enums, Collections, Annotations, Generics, Autoboxing and Data Structures

➢ Database: RDBMS/NoSQL (SQL, joins, indexing)

➢ Multithreading (Re-entrant Lock, Fork & Join, Sync, Executor Framework)

➢ Spring Core & Spring Boot, security, transactions

➢ Hands-on experience with JMS (ActiveMQ, RabbitMQ, Kafka, etc.)

➢ Memory management (JVM configuration, profiling, GC), performance tuning, and load testing with JMeter or similar tools

➢ DevOps (CI/CD: Maven/Gradle, Jenkins, quality plugins, Docker and containerization)

➢ Logical/analytical skills; thorough understanding of OOP concepts, design principles and the implementation of different types of design patterns

➢ Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j)

➢ Experience writing JUnit test cases using Mockito/PowerMock frameworks

➢ Should have practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN etc.

➢ Good communication skills and ability to work with global teams to define and deliver on projects.

➢ Sound understanding/experience in software development process, test-driven development.

➢ Cloud – AWS/Azure/GCP/PCF; any private cloud would also be fine

➢ Experience in Microservices
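
The concurrency items above (Executor Framework, Fork & Join) describe bounded worker pools that fan tasks out and gather results. Although this role is Java-centric, the shape of the pattern can be sketched briefly with Python's concurrent.futures; the symbols and prices below are made up:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_price(symbol):
    # Stand-in for an I/O-bound call, e.g. a downstream microservice.
    return {"symbol": symbol, "price": len(symbol) * 10}

symbols = ["INFY", "TCS", "HDFC"]

# A fixed-size pool bounds concurrency, mirroring a Java ExecutorService.
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(fetch_price, s) for s in symbols]
    results = [f.result() for f in as_completed(futures)]

prices = {r["symbol"]: r["price"] for r in results}
print(prices["TCS"])  # 30
```

Gathering via as_completed lets the caller process whichever task finishes first rather than blocking on submission order.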

Victrix Systems Labs

Posted by Vijayalaxmi Yadav
Pune
6 - 9 yrs
₹20L - ₹30L / yr
Apache Kafka
ElasticSearch
Amazon Web Services (AWS)
Java
Spring Boot

Role & Responsibilities:

- Lead the design, analysis, and implementation of technical solutions.
- Take full ownership of product features.
- Participate in detailed discussions with the product management team regarding requirements.
- Work closely with the engineering team to design and implement scalable solutions.
- Create detailed functional and technical specifications.
- Follow Test-Driven Development (TDD) and deliver high-quality code.
- Communicate proactively with your manager regarding risks and progress.
- Mentor junior team members and provide technical guidance.
- Troubleshoot and resolve production issues with RCA and long-term solutions.

Required Skills & Experience:

- Bachelor’s/Master’s degree in Computer Science or a related field with a solid academic track record.
- 6+ years of hands-on experience in backend development for large-scale enterprise products.
- Strong programming skills in Java; familiarity with Python is a plus.
- Deep understanding of data structures, algorithms, and problem-solving.
- Proficient in Spring Boot and RESTful APIs.
- Experience with technologies such as ElasticSearch, Kafka, MongoDB, Hazelcast, and Ceph.
- Strong experience in building scalable, concurrent applications.
- Exposure to Service-Oriented Architecture (SOA) and Test-Driven Development (TDD).
- Excellent communication and collaboration skills.

Preferred Technologies:

- Java
- Spring Boot, J2EE
- ElasticSearch
- Kafka
- MongoDB, Ceph
- AWS
- Storm, Hazelcast
- TDD, SOA



iRage

Posted by Jyosana Jadhav
Mumbai
4 - 7 yrs
₹25L - ₹40L / yr
React.js
JavaScript
DOM
WebSocket
Chart.js

We are seeking a highly skilled React JS Developer with exceptional DOM manipulation expertise and real-time data handling experience to join our team. You'll be building and optimizing high-performance user interfaces for stock market trading applications where milliseconds matter and data flows continuously.


The ideal candidate thrives in fast-paced environments, understands the intricacies of browser performance, and has hands-on experience with WebSockets and real-time data streaming architectures.


Key Responsibilities


Core Development

  • Advanced DOM Operations: Implement complex, performance-optimized DOM manipulations for real-time trading interfaces
  • Real-time Data Management: Build robust WebSocket connections and handle high-frequency data streams with minimal latency
  • Performance Engineering: Create lightning-fast, scalable front-end applications that process thousands of market updates per second
  • Custom Component Architecture: Design and build reusable, high-performance React components optimized for trading workflows


Collaboration & Integration

  • Work closely with traders, quants, and backend developers to translate complex trading requirements into intuitive interfaces
  • Collaborate with UX/UI designers and product managers to create responsive, trader-focused experiences
  • Integrate with real-time market data APIs and trading execution systems


Technical Excellence

  • Implement sophisticated data visualizations and interactive charts using libraries like Chart.js, TradingView, or custom D3.js solutions
  • Ensure cross-browser compatibility and responsiveness across multiple devices and screen sizes
  • Debug and resolve complex performance issues, particularly in real-time data processing and rendering
  • Maintain high-quality code through reviews, testing, and comprehensive documentation


Required Skills & Experience


React & JavaScript Mastery

  • 5+ years of professional React.js development with deep understanding of React internals, hooks, and advanced patterns
  • Expert-level JavaScript (ES6+) with strong proficiency in asynchronous programming, closures, and memory management
  • Advanced HTML5 & CSS3 skills with focus on performance and cross-browser compatibility


Real-time & Performance Expertise

  • Proven experience with WebSockets and real-time data streaming protocols
  • Strong DOM manipulation skills - direct DOM access, virtual scrolling, efficient updates, and performance optimization
  • RESTful API integration with experience in handling high-frequency data feeds
  • Browser performance optimization - understanding of rendering pipeline, memory management, and profiling tools
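
Handling thousands of market updates per second without degrading the UI typically comes down to coalescing bursts so the interface renders at most once per frame, keeping only the latest quote per symbol. That coalescing step is language-agnostic; a framework-free sketch with made-up tick data:

```python
def coalesce_ticks(ticks, frame_size):
    """Group a burst of (symbol, price) ticks into per-frame batches,
    keeping only the latest quote per symbol in each frame -- the same
    trick a trading UI uses to render once per animation frame."""
    frames = []
    for i in range(0, len(ticks), frame_size):
        frame = {}
        for symbol, price in ticks[i:i + frame_size]:
            frame[symbol] = price  # later ticks overwrite earlier ones
        frames.append(frame)
    return frames

ticks = [("AAPL", 1), ("AAPL", 2), ("MSFT", 5), ("AAPL", 3)]
print(coalesce_ticks(ticks, 3))
# [{'AAPL': 2, 'MSFT': 5}, {'AAPL': 3}]
```

In a browser the frame boundary would come from requestAnimationFrame rather than a fixed batch size, but the dropped-intermediate-updates idea is the same.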


Development Tools & Practices

  • Proficiency with modern build tools: Webpack, Babel, Vite, or similar
  • Experience with Git version control and collaborative development workflows
  • Agile/Scrum development environment experience
  • Understanding of testing frameworks (Jest, React Testing Library)


Financial Data Visualization

  • Experience with financial charting libraries: Chart.js, TradingView, D3.js, or custom visualization solutions
  • Understanding of market data structures, order books, and trading terminology
  • Knowledge of data streaming optimization techniques for financial applications


Nice-to-Have Skills


Domain Expertise

  • Prior experience in stock market, trading, or financial services - understanding of trading workflows, order management, risk systems
  • Algorithmic trading knowledge or exposure to quantitative trading systems
  • Financial market understanding - equities, derivatives, commodities


Technical Plus Points

  • Backend development experience with GoLang, Python, or Node.js
  • Database knowledge: SQL, NoSQL, time-series databases (InfluxDB, TimescaleDB)
  • Cloud platform experience: AWS, Azure, GCP for deploying scalable applications
  • Message queue systems: Redis, RabbitMQ, Kafka, NATS for real-time data processing
  • Microservices architecture understanding and API design principles


Advanced Skills

  • Service Worker implementation for offline-first applications
  • Progressive Web App (PWA) development
  • Mobile-first responsive design expertise


Qualifications

  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent professional experience)
  • 5+ years of professional React.js development with demonstrable experience in performance-critical applications
  • Portfolio or examples of complex real-time applications you've built
  • Financial services experience strongly preferred


Why You'll Love Working Here


We're a team that hustles—plain and simple. But we also believe life outside work matters. No cubicles, no suits—just great people doing great work in a space built for comfort and creativity.


What We Offer

💰 Competitive salary – Get paid what you're worth

🌴 Generous paid time off – Recharge and come back sharper

🌍 Work with the best – Collaborate with top-tier global talent

✈️ Adventure together – Annual offsites (mostly outside India) and regular team outings

🎯 Performance rewards – Multiple bonuses for those who go above and beyond

🏥 Health covered – Comprehensive insurance so you're always protected

Fun, not just work – On-site sports, games, and a lively workspace

🧠 Learn and lead – Regular knowledge-sharing sessions led by your peers

📚 Annual Education Stipend – Take any external course, bootcamp, or certification that makes you better at your craft

🏋️ Stay fit – Gym memberships with equal employer contribution to keep you at your best

🚚 Relocation support – Smooth move? We've got your back

🏆 Friendly competition – Work challenges and extracurricular contests to keep things exciting


We work hard, play hard, and grow together. Join us.



NonStop io Technologies Pvt Ltd
Posted by Kalyani Wadnere
Pune
6 - 8 yrs
Best in industry
Salesforce
Salesforce Apex
Lightning Web Component
Salesforce API
Sales Cloud

Key Responsibilities:

  • Design, build, and enhance Salesforce applications using Apex, Lightning Web Components (LWC), Visualforce, and SOQL.
  • Implement integrations with external systems using REST APIs and event-driven messaging (e.g., Kafka).
  • Collaborate with architects and business analysts to translate requirements into scalable, maintainable solutions.
  • Establish and follow engineering best practices, including source control (Git), code reviews, branching strategies, CI/CD pipelines, automated testing, and environment management.
  • Establish and maintain Azure DevOps-based workflows (repos, pipelines, automated testing) for Salesforce engineering.
  • Ensure solutions follow Salesforce security, data modeling, and performance guidelines.
  • Participate in Agile ceremonies, providing technical expertise and leadership within sprints and releases.
  • Optimize workflows, automations, and data processes across Sales Cloud, Service Cloud, and custom Salesforce apps.
  • Provide technical mentoring and knowledge sharing when required.
  • Support production environments, troubleshoot issues, and drive root-cause analysis for long-term reliability.
  • Stay current on Salesforce platform updates, releases, and new features, recommending adoption where beneficial.


Required Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
  • 6+ years of Salesforce development experience with strong knowledge of Apex, Lightning Web Components, and Salesforce APIs.
  • Proven experience with Salesforce core clouds (Sales Cloud, Service Cloud, or equivalent).
  • Strong hands-on experience with API integrations (REST/SOAP) and event-driven architectures (Kafka, Pub/Sub).
  • Solid understanding of engineering practices: Git-based source control (Salesforce DX/metadata), branching strategies, CI/CD, automated testing, and deployment management.
  • Familiarity with Azure DevOps repositories and pipelines.
  • Strong knowledge of Salesforce data modeling, security, and sharing rules.
  • Excellent problem-solving skills and ability to collaborate across teams.


Preferred Qualifications:

  • Salesforce Platform Developer II certification (or equivalent advanced credentials).
  • Experience with Health Cloud, Financial Services Cloud, or other industry-specific Salesforce products.
  • Experience implementing logging, monitoring, and observability within Salesforce and integrated systems.
  • Background in Agile/Scrum delivery with strong collaboration skills.
  • Prior experience establishing or enforcing engineering standards across Salesforce teams.
AryuPay Technologies
Posted by Bhavana Chaudhari
Bengaluru (Bangalore), Bhopal
6 - 8 yrs
₹6L - ₹10L / yr
Django
RESTful APIs
Software deployment
CI/CD
PostgreSQL

Senior Python Django Developer 

Experience: Back-end development: 6 years (Required)


Location: Bangalore / Bhopal

Job Description:

We are looking for a highly skilled Senior Python Django Developer with extensive experience in building and scaling financial or payments-based applications. The ideal candidate has a deep understanding of system design, architecture patterns, and testing best practices, along with a strong grasp of the startup environment.

This role requires a balance of hands-on coding, architectural design, and collaboration across teams to deliver robust and scalable financial products.

Responsibilities:

  • Design and develop scalable, secure, and high-performance applications using Python (Django framework).
  • Architect system components, define database schemas, and optimize backend services for speed and efficiency.
  • Lead and implement design patterns and software architecture best practices.
  • Ensure code quality through comprehensive unit testing, integration testing, and participation in code reviews.
  • Collaborate closely with Product, DevOps, QA, and Frontend teams to build seamless end-to-end solutions.
  • Drive performance improvements, monitor system health, and troubleshoot production issues.
  • Apply domain knowledge in payments and finance, including transaction processing, reconciliation, settlements, wallets, UPI, etc.
  • Contribute to technical decision-making and mentor junior developers.
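
The reconciliation duty above usually reduces to a set comparison between the internal ledger and the gateway's settlement records: match by transaction id, then flag whatever is unmatched on either side. A simplified sketch (the record shape and ids are hypothetical):

```python
def reconcile(ledger, gateway):
    """Match internal ledger entries against gateway settlement records
    by transaction id; anything unmatched on either side needs review."""
    ledger_ids = {t["txn_id"]: t for t in ledger}
    gateway_ids = {t["txn_id"]: t for t in gateway}
    matched = sorted(ledger_ids.keys() & gateway_ids.keys())
    missing_at_gateway = sorted(ledger_ids.keys() - gateway_ids.keys())
    unknown_to_ledger = sorted(gateway_ids.keys() - ledger_ids.keys())
    return matched, missing_at_gateway, unknown_to_ledger

ledger = [{"txn_id": "T1"}, {"txn_id": "T2"}]
gateway = [{"txn_id": "T2"}, {"txn_id": "T3"}]
print(reconcile(ledger, gateway))
# (['T2'], ['T1'], ['T3'])
```

A production version would also compare amounts and statuses for matched ids, since an id can match while the settled amount disagrees.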

Requirements:

  • 6 to 10 years of professional backend development experience with Python and Django.
  • Strong background in payments/financial systems or FinTech applications.
  • Proven experience in designing software architecture in a microservices or modular monolith environment.
  • Experience working in fast-paced startup environments with agile practices.
  • Proficiency in RESTful APIs, SQL (PostgreSQL/MySQL), NoSQL (MongoDB/Redis).
  • Solid understanding of Docker, CI/CD pipelines, and cloud platforms (AWS/GCP/Azure).
  • Hands-on experience with test-driven development (TDD) and frameworks like pytest, unittest, or factory_boy.
  • Familiarity with security best practices in financial applications (PCI compliance, data encryption, etc.).

Preferred Skills:

  • Exposure to event-driven architecture (Celery, Kafka, RabbitMQ).
  • Experience integrating with third-party payment gateways, banking APIs, or financial instruments.
  • Understanding of DevOps and monitoring tools (Prometheus, ELK, Grafana).
  • Contributions to open-source or personal finance-related projects.

Job Types: Full-time, Permanent


Schedule:

  • Day shift

Supplemental Pay:

  • Performance bonus
  • Yearly bonus

Ability to commute/relocate:

  • JP Nagar, 5th Phase, Bangalore, Karnataka or Indrapuri, Bhopal, Madhya Pradesh: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)


 

Virtana

Posted by Bimla Dhirayan
Pune, Chennai
4 - 10 yrs
Best in industry
Java
Go (Golang)
Kubernetes
Python
Apache Kafka

Senior Software Engineer 

Challenge convention and work on cutting edge technology that is transforming the way our customers manage their physical, virtual and cloud computing environments. Virtual Instruments seeks highly talented people to join our growing team, where your contributions will impact the development and delivery of our product roadmap. Our award-winning Virtana Platform provides the only real-time, system-wide, enterprise scale solution for providing visibility into performance, health and utilization metrics, translating into improved performance and availability while lowering the total cost of the infrastructure supporting mission-critical applications.  

We are seeking an individual with expert knowledge in Systems Management and/or Systems Monitoring Software, Observability platforms and/or Performance Management Software and Solutions with insight into integrated infrastructure platforms like Cisco UCS, infrastructure providers like Nutanix, VMware, EMC & NetApp and public cloud platforms like Google Cloud and AWS to expand the depth and breadth of Virtana Products. 


Work Location: Pune/ Chennai


Job Type: Hybrid

 

Role Responsibilities: 

  • The engineer will be primarily responsible for architecture, design and development of software solutions for the Virtana Platform 
  • Partner and work closely with cross functional teams and with other engineers and product managers to architect, design and implement new features and solutions for the Virtana Platform. 
  • Communicate effectively across the departments and R&D organization having differing levels of technical knowledge.  
  • Work closely with UX Design, Quality Assurance, DevOps and Documentation teams. Assist with functional and system test design and deployment automation 
  • Provide customers with complex and end-to-end application support, problem diagnosis and problem resolution 
  • Learn new technologies quickly and leverage 3rd party libraries and tools as necessary to expedite delivery 

 

Required Qualifications:    

  • Minimum of 7 years of progressive experience with back-end development in a client-server application development environment focused on Systems Management, Systems Monitoring and Performance Management Software. 
  • Deep experience in public cloud environment using Kubernetes and other distributed managed services like Kafka etc (Google Cloud and/or AWS) 
  • Experience with CI/CD and cloud-based software development and delivery 
  • Deep experience with integrated infrastructure platforms and experience working with one or more data collection technologies like SNMP, REST, OTEL, WMI, WBEM. 
  • Minimum of 6 years of development experience with one or more high-level languages such as Go, Python, or Java. Deep experience with one of these languages is required. 
  • Bachelor’s or Master’s degree in computer science, Computer Engineering or equivalent 
  • Highly effective verbal and written communication skills and ability to lead and participate in multiple projects 
  • Well versed with identifying opportunities and risks in a fast-paced environment and ability to adjust to changing business priorities 
  • Must be results-focused, team-oriented and with a strong work ethic 

 

Desired Qualifications: 

  • Prior experience with other virtualization platforms like OpenShift is a plus 
  • Prior experience as a contributor to engineering and integration efforts with strong attention to detail and exposure to Open-Source software is a plus 
  • Demonstrated ability as a lead engineer who can architect, design and code with strong communication and teaming skills 
  • Deep development experience with the development of Systems, Network and performance Management Software and/or Solutions is a plus 

  

About Virtana: Virtana delivers the industry’s broadest and deepest Observability Platform that allows organizations to monitor infrastructure, de-risk cloud migrations, and reduce cloud costs by 25% or more. 

  

Over 200 Global 2000 enterprise customers, such as AstraZeneca, Dell, Salesforce, Geico, Costco, Nasdaq, and Boeing, have valued Virtana’s software solutions for over a decade. 

  

Our modular platform for hybrid IT digital operations includes Infrastructure Performance Monitoring and Management (IPM), Artificial Intelligence for IT Operations (AIOps), Cloud Cost Management (FinOps), and Workload Placement Readiness Solutions. Virtana is simplifying the complexity of hybrid IT environments with a single cloud-agnostic platform across all the categories listed above. The $30B IT Operations Management (ITOM) Software market is ripe for disruption, and Virtana is uniquely positioned for success. 

 

Tecblic Private Limited
Ahmedabad
8 - 10 yrs
₹10L - ₹20L / yr
Data Warehousing
Python
Power BI
Tableau
Data Visualization

Data Analytics Lead


Responsibilities:

·        Oversee the design, development, and implementation of data analysis solutions to meet business needs.

·        Work closely with business stakeholders and the Aviation SME to define data requirements, project scope, and deliverables.

·        Drive the design and development of analytics data models and data warehouse designs.

·        Develop and maintain data quality standards and procedures.

·        Manage and prioritize data analysis projects, ensuring timely completion.

·        Identify opportunities to improve data analysis processes and tools.

·        Collaborate with Data Engineers and Data Architects to ensure data solutions align with the overall data platform architecture.

·        Evaluate and recommend new data analysis tools and technologies.

·        Contribute to the development of best practices for data analysis.

·        Participate in project meetings and provide input on data-related issues, risks and requirements.

 

Qualifications

·        8+ years of experience as a Data Analytics Lead, with experience leading or mentoring a team.

·        Extensive experience with cloud-based data modelling and data warehousing solutions, using Azure Databricks.

·        Proven experience in data technologies and platforms, ETL processes and tools, preferably using Azure Data Factory, Azure Databricks (Spark), Delta Lake.

·        Advanced proficiency in data visualization tools such as Power BI.

 

Data Analysis and Visualization:

  • Experience in data analysis, statistical modelling, and machine learning techniques.
  • Proficiency in analytical tools like Python, R, and libraries such as Pandas, NumPy for data analysis and modelling.
  • Strong expertise in Power BI, Superset, and Tableau for data visualization, data modelling, and DAX queries, with knowledge of best practices.
  • Experience in implementing Row-Level Security in Power BI.
  • Ability to work with medium-complex data models and quickly understand application data design and processes.
  • Familiar with industry best practices for Power BI and experienced in performance optimization of existing implementations.
  • Understanding of machine learning algorithms, including supervised, unsupervised, and deep learning techniques.


Data Handling and Processing:

  • Proficient in SQL Server and query optimization.
  • Expertise in application data design and process management.
  • Extensive knowledge of data modelling.
  • Hands-on experience with Azure Data Factory and Azure Databricks.
  • Expertise in data warehouse development, including experience with SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services).
  • Proficiency in ETL processes (data extraction, transformation, and loading), including data cleaning and normalization.
  • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) for large-scale data processing.

Understanding of data governance, compliance, and security measures within Azure environments.
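
The ETL duties above (data extraction, transformation, cleaning, and normalization) can be sketched in plain Python. The record layout and cleaning rules below are illustrative assumptions for the sketch, not part of the role description:

```python
# Toy ETL cleaning/normalization pass: trim and case-normalize text,
# coerce numeric strings, and drop rows missing the key field. The
# field names and rules are illustrative assumptions.
def clean_records(raw_rows):
    cleaned = []
    for row in raw_rows:
        name = (row.get("name") or "").strip().title()
        if not name:              # drop rows missing the key field
            continue
        try:
            revenue = float(str(row.get("revenue", "0")).replace(",", ""))
        except ValueError:
            revenue = 0.0         # normalize unparseable values to 0
        cleaned.append({"name": name, "revenue": revenue})
    return cleaned

rows = [
    {"name": "  acme air ", "revenue": "1,200.50"},
    {"name": "", "revenue": "900"},        # dropped: no name
    {"name": "jet co", "revenue": "n/a"},  # revenue coerced to 0.0
]
print(clean_records(rows))
```

In a production pipeline the same rules would typically live in an Azure Data Factory or Databricks transformation step rather than ad-hoc Python.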

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
2 - 6 yrs
₹4L - ₹9L / yr
skill iconPython
skill iconDjango
skill iconRedis
RabbitMQ
Celery
+5 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon (Work from Office)


Job Summary :

We are looking for an experienced Python Django Developer with strong expertise in building scalable web applications and distributed systems. The ideal candidate must have hands-on experience with Django, Redis, Celery, RabbitMQ, PostgreSQL, and Kafka to design and optimize high-performance applications.


Mandatory Skills :

Python, Django, Redis, Celery, RabbitMQ, PostgreSQL, Kafka


Key Responsibilities :

  • Design, develop, and maintain web applications using Python & Django.
  • Implement asynchronous tasks and background job processing using Celery with RabbitMQ/Redis.
  • Work with PostgreSQL for database design, optimization, and complex queries.
  • Integrate and optimize messaging/streaming systems using Kafka.
  • Write clean, scalable, and efficient code following best practices.
  • Troubleshoot, debug, and optimize application performance.
  • Collaborate with cross-functional teams (frontend, DevOps, QA) for end-to-end delivery.
  • Stay updated with the latest backend development trends and technologies.
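
The Celery-with-RabbitMQ/Redis responsibility above boils down to a broker/worker pattern: producers enqueue tasks, workers pull and execute them. A stdlib toy of that pattern (not Celery's actual API — `delay` here is only an analogue of Celery's `.delay()`):

```python
# Toy broker/worker pattern that Celery + RabbitMQ/Redis implement at
# scale: tasks go onto a queue, a background worker pulls and runs them.
import queue
import threading

broker = queue.Queue()
results = {}

def worker():
    while True:
        task_id, func, args = broker.get()
        results[task_id] = func(*args)   # run the task, store the result
        broker.task_done()

threading.Thread(target=worker, daemon=True).start()

def delay(task_id, func, *args):
    """Enqueue a task for background execution (analogue of Celery's .delay)."""
    broker.put((task_id, func, args))

delay("t1", sum, [1, 2, 3])
broker.join()                            # wait for the queue to drain
print(results["t1"])
```

Celery adds on top of this: durable brokers, retries, result backends, and worker pools across machines.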

Requirements :

  • Minimum 3+ years of experience in backend development using Python & Django.
  • Hands-on experience with Redis, Celery, RabbitMQ, Kafka, and PostgreSQL.
  • Strong understanding of REST APIs, microservices architecture, and asynchronous task management.
  • Knowledge of performance tuning, caching strategies, and scalable system design.
  • Familiarity with Git, CI/CD pipelines, and cloud deployment (AWS/GCP/Azure) is a plus.
  • Excellent problem-solving and communication skills.
Read more
Virtana

at Virtana

2 candid answers
Bimla Dhirayan
Posted by Bimla Dhirayan
Pune, Chennai
4 - 10 yrs
Best in industry
skill iconJava
skill iconGo Programming (Golang)
skill iconDocker
openshift
network performance
+13 more

Senior Software Engineer 

Challenge convention and work on cutting-edge technology that is transforming the way our customers manage their physical, virtual and cloud computing environments. Virtual Instruments seeks highly talented people to join our growing team, where your contributions will impact the development and delivery of our product roadmap. Our award-winning Virtana Platform provides the only real-time, system-wide, enterprise-scale solution for visibility into performance, health and utilization metrics, translating into improved performance and availability while lowering the total cost of the infrastructure supporting mission-critical applications.

We are seeking an individual with expert knowledge of Systems Management and/or Systems Monitoring software, observability platforms, and/or Performance Management software and solutions, with insight into integrated infrastructure platforms like Cisco UCS, infrastructure providers like Nutanix, VMware, EMC, and NetApp, and public cloud platforms like Google Cloud and AWS, to expand the depth and breadth of Virtana Products.

 

Role Responsibilities: 

  • The engineer will be primarily responsible for architecture, design and development of software solutions for the Virtana Platform 
  • Partner and work closely with cross functional teams and with other engineers and product managers to architect, design and implement new features and solutions for the Virtana Platform. 
  • Communicate effectively across the departments and R&D organization having differing levels of technical knowledge.  
  • Work closely with UX Design, Quality Assurance, DevOps and Documentation teams. Assist with functional and system test design and deployment automation 
  • Provide customers with complex and end-to-end application support, problem diagnosis and problem resolution 
  • Learn new technologies quickly and leverage 3rd party libraries and tools as necessary to expedite delivery 

 

Required Qualifications:    

  • 4-10 years of progressive experience with back-end development in a client-server application development environment focused on Systems Management, Systems Monitoring, and Performance Management software.
  • Deep experience in a public cloud environment (Google Cloud and/or AWS) using Kubernetes and other distributed managed services such as Kafka.
  • Experience with CI/CD and cloud-based software development and delivery.
  • Deep experience with integrated infrastructure platforms and experience working with one or more data collection technologies like SNMP, REST, OTEL, WMI, WBEM.
  • Minimum of 6 years of development experience with one or more high-level languages such as Go, Python, or Java; deep experience with at least one of these languages is required.
  • Bachelor’s or Master’s degree in computer science, Computer Engineering or equivalent 
  • Highly effective verbal and written communication skills and ability to lead and participate in multiple projects 
  • Well versed with identifying opportunities and risks in a fast-paced environment and ability to adjust to changing business priorities 
  • Must be results-focused, team-oriented and with a strong work ethic 

 

Desired Qualifications: 

  • Prior experience with other virtualization platforms like OpenShift is a plus 
  • Prior experience as a contributor to engineering and integration efforts with strong attention to detail and exposure to Open-Source software is a plus 
  • Demonstrated ability as a lead engineer who can architect, design and code with strong communication and teaming skills 
  • Deep development experience with the development of Systems, Network and performance Management Software and/or Solutions is a plus 

  

About Virtana: 

Virtana delivers the industry's broadest and deepest Observability Platform, which allows organizations to monitor infrastructure, de-risk cloud migrations, and reduce cloud costs by 25% or more.

  

Over 200 Global 2000 enterprise customers, such as AstraZeneca, Dell, Salesforce, Geico, Costco, Nasdaq, and Boeing, have valued Virtana’s software solutions for over a decade. 

  

Our modular platform for hybrid IT digital operations includes Infrastructure Performance Monitoring and Management (IPM), Artificial Intelligence for IT Operations (AIOps), Cloud Cost Management (FinOps), and Workload Placement Readiness Solutions. Virtana is simplifying the complexity of hybrid IT environments with a single cloud-agnostic platform across all the categories listed above. The $30B IT Operations Management (ITOM) software market is ripe for disruption, and Virtana is uniquely positioned for success.

 

 

Read more
NeoGenCode Technologies Pvt Ltd
Ritika Verma
Posted by Ritika Verma
Gurugram
3 - 6 yrs
₹6L - ₹9L / yr
skill iconPython
skill iconDjango
skill iconRedis
Celery
RabbitMQ
+2 more

Job Title: Python Developer - Django (Full Time)

Location: Gurgaon, Onsite

Interview: Virtual Interview

Experience Required: 3+ Years

About the Role

We are looking for a skilled Python Developer with hands-on experience in building scalable backend systems. The ideal candidate should have strong expertise in Python, Django, distributed task queues using Celery, Redis, RabbitMQ, and experience working with event streaming platforms like Kafka.

Key Responsibilities

  • Design, develop, and maintain backend services using Python and Django.
  • Implement and optimize task queues using Celery with Redis/RabbitMQ as brokers.
  • Develop and integrate event-driven systems using Apache Kafka.
  • Write clean, reusable, and efficient code following best practices.
  • Build RESTful APIs and integrate with external services.
  • Ensure performance, scalability, and security of applications.
  • Collaborate with frontend developers, DevOps, and product teams to deliver high-quality solutions.
  • Troubleshoot and debug issues in production and staging environments.
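
The event-driven Kafka responsibility above rests on a publish/subscribe model: producers emit events to a topic, and every subscriber to that topic reacts independently. A minimal stdlib dispatch illustrating the idea (Kafka adds partitioning, persistence, and consumer groups on top; the topic and handler names are illustrative assumptions):

```python
# Minimal publish/subscribe dispatch: each event published to a topic
# is delivered to every handler subscribed to that topic.
from collections import defaultdict

subscribers = defaultdict(list)
audit_log = []

def subscribe(topic, handler):
    subscribers[topic].append(handler)

def publish(topic, event):
    for handler in subscribers[topic]:
        handler(event)

# Two independent consumers of the same hypothetical event stream.
subscribe("order.created", lambda e: audit_log.append(("audit", e["id"])))
subscribe("order.created", lambda e: audit_log.append(("email", e["id"])))

publish("order.created", {"id": 42})
print(audit_log)
```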

Required Skills & Experience

  • 2+ years of professional experience in Python backend development.
  • Strong knowledge of Django Framework.
  • Hands-on experience with Celery, Redis, RabbitMQ, and Kafka.
  • Good understanding of REST API design principles.
  • Experience with relational databases (PostgreSQL/MySQL).
  • Familiarity with version control (Git) and Agile development.
  • Strong problem-solving skills and ability to work in a fast-paced environment.


Read more
MindCrew Technologies

at MindCrew Technologies

3 recruiters
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai
6 - 8 yrs
₹10L - ₹15L / yr
skill iconC#
ADO.NET
Entity Framework
skill iconReact.js
skill iconPostgreSQL
+6 more

Role: .NET + React Developer

Experience- 6+Years

Location- Andheri (Navi Mumbai)

Budget- 18 LPA

Opportunity - Contract 


Technical Expertise:

* Proficiency in OOP concepts, C#, .NET Core, Entity Framework, React, SQL Server, PostgreSQL, Dapper, ADO.NET, LINQ, and Web API Development.

* Experience with Kafka or RabbitMQ for event-driven architecture and messaging systems.

* Debugging and troubleshooting skills with an understanding of performance optimization.

* Strong knowledge of database development, including tables, views, stored procedures, triggers, and functions.

* Familiarity with unit testing frameworks such as XUnit.

* Experience with JWT services, Git, and third-party API integration.

* Experience in code reviews of junior developers.


Read more
Geektrust
Agency job
via Scaling Theory by Sriraagavi Baskaran
Bengaluru (Bangalore)
2.5 - 5 yrs
₹10L - ₹30L / yr
skill iconJava
Microsoft SharePoint
Apache Kafka
skill iconElastic Search

Role Overview


We are seeking a motivated and technically versatile CMS Engineer (IC2) to support our transition from SharePoint to Contentful, while also contributing to the broader CMS ecosystem. This is an excellent opportunity for an early-career engineer to work on enterprise-grade platforms and microservice-based architectures.


Key Qualifications

  • 3-5 years of experience with SharePoint Online and/or enterprise CMS platforms.
  • Familiarity with Contentful or other headless CMS solutions is a strong plus.
  • Hands-on experience with Java, Spring Boot, and relational databases (e.g., PostgreSQL).
  • Exposure to Kafka, Elasticsearch, or similar distributed technologies is desirable.
  • Solid problem-solving and communication skills with an eagerness to learn.

 

What would you do here

SharePoint to Contentful Migration | Backend + CMS Integration

 

  • Assist in maintaining and enhancing SharePoint Online content and features during the transition period.
  • Support the migration of pages, documents, and metadata from SharePoint to Contentful.
  • Contribute to the design and development of backend services that integrate with Contentful using Java, Spring Boot, and REST APIs.
  • Write reusable services for content delivery, search indexing (via Elasticsearch), and event processing (via Kafka).
  • Help develop APIs for CMS-based applications that interact with PostgreSQL databases.
  • Troubleshoot CMS-related issues and support testing efforts during platform migration.


Read more
Zenius IT Services Pvt Ltd

at Zenius IT Services Pvt Ltd

2 candid answers
Sunita Pradhan
Posted by Sunita Pradhan
Bengaluru (Bangalore), Hyderabad, Chennai
8 - 13 yrs
₹15L - ₹30L / yr
skill icon.NET
skill iconReact.js
skill iconC#
skill iconJavascript
TypeScript
+9 more

About the Role:

We are looking for a highly skilled Full-Stack Developer with expertise in .NET Core to develop and maintain scalable web applications and microservices. The ideal candidate will have strong problem-solving skills, experience in modern software development, and a passion for creating robust, high-performance applications.



Key Responsibilities:


Backend Development:

  • Design, develop, and maintain microservices and APIs using .NET Core; a good understanding of the .NET Framework is also expected.
  • Implement RESTful APIs, ensuring high performance and security.
  • Optimize database queries and design schemas for SQL Server / Snowflake / MongoDB.


Software Architecture & DevOps:

  • Design and implement scalable microservices architecture.
  • Work with Docker, Kubernetes, and CI/CD pipelines for deployment and automation.
  • Ensure best practices in security, scalability, and performance.


Collaboration & Agile Development:

  • Work closely with UI/UX designers, backend engineers, and product managers.
  • Participate in Agile/Scrum ceremonies, code reviews, and knowledge-sharing sessions.
  • Write clean, maintainable, and well-documented code.


Required Skills & Qualifications:

  • 3+ years of experience as a Full-Stack Developer.
  • Strong experience in .NET Core, C#.
  • Proficiency in React.js, JavaScript (ES6+), TypeScript.
  • Experience with RESTful APIs, Microservices architecture.
  • Knowledge of SQL / NoSQL databases (SQL Server, Snowflake, MongoDB).
  • Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
  • Familiarity with Cloud services (Azure, AWS, or GCP) is a plus.
  • Strong debugging and troubleshooting skills.


Nice-to-Have:

  • Experience with GraphQL, gRPC, WebSockets.
  • Exposure to serverless architecture and cloud-based solutions.
  • Knowledge of authentication/authorization frameworks (OAuth, JWT, Identity Server).
  • Experience with unit testing and integration testing.


Read more
empowers digital transformation for innovative and high grow

empowers digital transformation for innovative and high grow

Agency job
via Hirebound by Jebin Joy
Pune
4 - 12 yrs
₹12L - ₹30L / yr
Hadoop
Spark
Apache Kafka
ETL
skill iconJava
+2 more

To be successful in this role, you should possess

  • Collaborate closely with Product Management and Engineering leadership to devise and build the right solution.
  • Participate in design discussions and brainstorming sessions to select, integrate, and maintain Big Data tools and frameworks required to solve Big Data problems at scale.
  • Design and implement systems to cleanse, process, and analyze large data sets using distributed processing tools like Akka and Spark.
  • Understand and critically review existing data pipelines, and come up with ideas, in collaboration with Technical Leaders and Architects, to improve upon current bottlenecks.
  • Take initiative, show the drive to pick up new things proactively, and work as a senior individual contributor on the multiple products and features we have.
  • 3+ years of experience in developing highly scalable Big Data pipelines.
  • In-depth understanding of the Big Data ecosystem, including processing frameworks like Spark, Akka, Storm, and Hadoop, and the file types they deal with.
  • Experience with ETL and data pipeline tools like Apache NiFi, Airflow, etc.
  • Excellent coding skills in Java or Scala, including the understanding to apply appropriate design patterns when required.
  • Experience with Git and build tools like Gradle/Maven/SBT.
  • Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
  • An elegant, readable, maintainable, and extensible code style.


You are someone who would easily be able to

  • Work closely with the US and India engineering teams to help build the Java/Scala-based data pipelines.
  • Lead the India engineering team in technical excellence and ownership of critical modules; own the development of new modules and features.
  • Troubleshoot live production server issues.
  • Handle client coordination and be able to work as part of a team, contribute independently, and drive the team to exceptional contributions with minimal supervision.
  • Follow Agile methodology, using JIRA for work planning and issue management/tracking.


Additional Project/Soft Skills:

• Should be able to work independently with India & US based team members.

• Strong verbal and written communication with ability to articulate problems and solutions over phone and emails.

• Strong sense of urgency, with a passion for accuracy and timeliness.

• Ability to work calmly in high pressure situations and manage multiple projects/tasks.

• Ability to work independently and possess superior skills in issue resolution.

• Should have the passion to learn and implement, analyze and troubleshoot issues

Read more
GaragePlug

at GaragePlug

4 candid answers
6 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
3yrs+
Best in industry
skill iconJava
skill iconSpring Boot
skill iconAmazon Web Services (AWS)
Messaging
Amazon SQS
+7 more

GaragePlug Inc

GaragePlug is one of the fastest-growing Automotive tech startups working towards revolutionising the automotive aftermarket industry with strong state-of-the-art technologies.


Role Overview

As we plan to grow, we have many challenges to solve. Some of the new features and products that are already in the pipeline include advanced analytics, search, reporting, etc., to name a few. Our present backend is based on the microservices architecture built using Spring Boot. With growing complexity, we are open to using other tools and technologies as needed. We are looking for a talented and motivated engineer to join our fleet and help us solve real-world problems in this exciting field. Join us and share the dream of building the next-generation online platform for the Auto industry.


What you'll do:

  • Design and architect our core components
  • End-to-end systems development
  • Ownership of complete systems from development to production and maintenance
  • Infrastructure management on AWS

Technologies you'll use:

  • Microservices, AWS, Java, Spring-boot
  • Gradle / Maven
  • ElasticSearch
  • Jenkins, CI/CD
  • Containerization technologies like Docker, Kubernetes, etc.
  • RDBMS (PostgreSQL) or NoSQL databases (MongoDB) & Enterprise Messaging Applications (Kafka/SQS)
  • JUnit, TestNG, Cucumber, etc.
  • Nginx
  • Any cool piece of technology that you can bring onboard


What you are:

  • You love technology and are always open to learning new tools
  • You are proficient with server technologies: Spring / Spring Boot
  • You have good experience in scaling, performance tuning & optimization at both API and storage layers
  • You have an excellent grasp of OOPS concepts, data structures, algorithms, design patterns & REST APIs
  • You are proficient in Java, SQL
  • You have good knowledge of Databases: RDBMS/Document
  • You have a good understanding of REST API design
  • You have knowledge of DevOps
  • Implement Coding Best Practices. Implement Code Quality gates as per the program norms
  • Knowledge of Angular 2+ is a big plus
Read more
Genspark

at Genspark

2 candid answers
Agency job
via hirezyai by HR Hirezyai
Bengaluru (Bangalore), Chennai, Coimbatore
5 - 9 yrs
₹9L - ₹25L / yr
Apache Kafka
Apache
MLOps
skill iconAmazon Web Services (AWS)

The candidate should have extensive experience in designing and developing scalable data pipelines and real-time data processing solutions. As a key member of the team, the Senior Data Engineer will play a critical role in building end-to-end data workflows, supporting machine learning model deployment, and driving MLOps practices in a fast-paced, agile environment. Strong expertise in Apache Kafka, Apache Flink, AWS SageMaker, and Terraform is essential. Additional experience with infrastructure automation and CI/CD for ML models is a significant advantage.

Key Responsibilities

  1. Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink.
  2. Build scalable and automated MLOps pipelines for training, validation, and deployment of models using AWS SageMaker and associated services.
  3. Implement and manage Infrastructure as Code (IaC) using Terraform to provision and manage AWS environments.
  4. Collaborate with data scientists, ML engineers, and DevOps teams to streamline model deployment workflows and ensure reliable production delivery.
  5. Optimize data storage and retrieval strategies for large-scale structured and unstructured datasets.
  6. Develop data transformation logic and integrate data from various internal and external sources into data lakes and warehouses.
  7. Monitor, troubleshoot, and enhance performance of data systems in a cloud-native, fast-evolving production setup.
  8. Ensure adherence to data governance, privacy, and security standards across all data handling activities.
  9. Document data engineering solutions and workflows to facilitate cross-functional understanding and ongoing maintenance.
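
The real-time pipeline work in items 1 and 7 typically centers on windowed aggregation of a stream, which Flink performs over Kafka topics. The mechanics can be sketched in pure Python; the 10-second window size and (timestamp, value) event shape are illustrative assumptions:

```python
# Tumbling-window aggregation: each event is assigned to the fixed-size
# window containing its timestamp, and values are summed per window.
from collections import defaultdict

WINDOW_SECONDS = 10

def tumbling_window_sums(events):
    windows = defaultdict(int)
    for ts, value in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        windows[window_start] += value
    return dict(windows)

events = [(1, 5), (4, 7), (12, 3), (19, 1), (21, 2)]
print(tumbling_window_sums(events))   # {0: 12, 10: 4, 20: 2}
```

Flink adds what this sketch omits: event-time vs. processing-time semantics, watermarks for late data, and fault-tolerant state.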


Read more
Gurugram, Bangarmau
5 - 8 yrs
₹12L - ₹15L / yr
skill iconJava
Microservices
SQL
RESTful APIs
+3 more
  • Strong proficiency in Java programming language.
  • Experience with Java frameworks like Spring and Spring Boot.
  • Understanding of RESTful APIs and web services.
  • Experience with databases and data storage technologies (e.g., SQL, NoSQL).
  • Knowledge of software development best practices, including testing and code quality.
  • Experience with version control systems (e.g., Git).
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP).
  • Strong problem-solving and debugging skills.
  • Excellent communication and collaboration skills. 


Read more
Sonatype

at Sonatype

5 candid answers
Reshika Mendiratta
Posted by Reshika Mendiratta
Hyderabad
6 - 10 yrs
₹15L - ₹33L / yr
ETL
Spark
Apache Kafka
skill iconPython
skill iconJava
+11 more

The Opportunity

We’re looking for a Senior Data Engineer to join our growing Data Platform team. This role is a hybrid of data engineering and business intelligence, ideal for someone who enjoys solving complex data challenges while also building intuitive and actionable reporting solutions.


You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, dashboards, machine learning, and decision-making across Sonatype. You’ll also be responsible for delivering clear, compelling, and insightful business intelligence through tools like Looker Studio and advanced SQL queries.


What You’ll Do

  • Design, build, and maintain scalable data pipelines and ETL/ELT processes.
  • Architect and optimize data models and storage solutions for analytics and operational use.
  • Create and manage business intelligence reports and dashboards using tools like Looker Studio, Power BI, or similar.
  • Collaborate with data scientists, analysts, and stakeholders to ensure datasets are reliable, meaningful, and actionable.
  • Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake).
  • Write complex, high-performance SQL queries to support reporting and analytics needs.
  • Implement observability, alerting, and data quality monitoring for critical pipelines.
  • Drive best practices in data engineering and business intelligence, including documentation, testing, and CI/CD.
  • Contribute to the evolution of our next-generation data lakehouse and BI architecture.


What We’re Looking For


Minimum Qualifications

  • 5+ years of experience as a Data Engineer or in a hybrid data/reporting role.
  • Strong programming skills in Python, Java, or Scala.
  • Proficiency with data tools such as Databricks, data modeling techniques (e.g., star schema, dimensional modeling), and data warehousing solutions like Snowflake or Redshift.
  • Hands-on experience with modern data platforms and orchestration tools (e.g., Spark, Kafka, Airflow).
  • Proficient in SQL with experience in writing and optimizing complex queries for BI and analytics.
  • Experience with BI tools such as Looker Studio, Power BI, or Tableau.
  • Experience in building and maintaining robust ETL/ELT pipelines in production.
  • Understanding of data quality, observability, and governance best practices.
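
The star-schema and dimensional-modeling qualification above can be sketched with the stdlib sqlite3 module: a central fact table keyed to dimension tables, queried by joining and aggregating. Table and column names here are illustrative assumptions:

```python
# Minimal star schema: one fact table (sales) and one dimension
# (product), plus the typical BI query shape over them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (1, 1, 9.5), (2, 1, 10.5), (3, 2, 4.0);
""")
# Aggregate the fact table, grouped by a dimension attribute.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)
```

In Snowflake or Redshift the schema shape is the same; what changes is distribution/clustering of the fact table for query performance.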


Bonus Points

  • Experience with dbt, Terraform, or Kubernetes.
  • Familiarity with real-time data processing or streaming architectures.
  • Understanding of data privacy, compliance, and security best practices in analytics and reporting.


Why You’ll Love Working Here

  • Data with purpose: Work on problems that directly impact how the world builds secure software.
  • Full-spectrum impact: Use both engineering and analytical skills to shape product, strategy, and operations.
  • Modern tooling: Leverage the best of open-source and cloud-native technologies.
  • Collaborative culture: Join a passionate team that values learning, autonomy, and real-world impact.
Read more
Sonatype

at Sonatype

5 candid answers
Reshika Mendiratta
Posted by Reshika Mendiratta
Hyderabad
2 - 5 yrs
Upto ₹20L / yr (Varies)
skill iconPython
ETL
Spark
Apache Kafka
databricks
+12 more

About the Role

We’re hiring a Data Engineer to join our Data Platform team. You’ll help build and scale the systems that power analytics, reporting, and data-driven features across the company. This role works with engineers, analysts, and product teams to make sure our data is accurate, available, and usable.


What You’ll Do

  • Build and maintain reliable data pipelines and ETL/ELT workflows.
  • Develop and optimize data models for analytics and internal tools.
  • Work with team members to deliver clean, trusted datasets.
  • Support core data platform tools like Airflow, dbt, Spark, Redshift, or Snowflake.
  • Monitor data pipelines for quality, performance, and reliability.
  • Write clear documentation and contribute to test coverage and CI/CD processes.
  • Help shape our data lakehouse architecture and platform roadmap.


What You Need

  • 2–4 years of experience in data engineering or a backend data-related role.
  • Strong skills in Python or another backend programming language.
  • Experience working with SQL and distributed data systems (e.g., Spark, Kafka).
  • Familiarity with NoSQL stores like HBase or similar.
  • Comfortable writing efficient queries and building data workflows.
  • Understanding of data modeling for analytics and reporting.
  • Exposure to tools like Airflow or other workflow schedulers.


Bonus Points

  • Experience with DBT, Databricks, or real-time data pipelines.
  • Familiarity with cloud infrastructure tools like Terraform or Kubernetes.
  • Interest in data governance, ML pipelines, or compliance standards.


Why Join Us?

  • Work on data that supports meaningful software security outcomes.
  • Use modern tools in a cloud-first, open-source-friendly environment.
  • Join a team that values clarity, learning, and autonomy.


If you're excited about building impactful software and helping others do the same, this is an opportunity to grow as a technical leader and make a meaningful impact.

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Mumbai
2 - 4 yrs
₹4L - ₹15L / yr
skill iconMongoDB
skill iconExpress
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconDocker
+3 more

🚀 Hiring: MERN Stack Developer at Deqode

⭐ Experience: 2+ Years

📍 Location: Mumbai

⭐ Work Mode:- 5 Days Work From Office

⏱️ Notice Period: Immediate Joiners or 15 Days

(Only immediate joiners & candidates serving notice period)


MERN Stack (2+ Years of Experience) - Mumbai

🔹 Experience: 2 to 4 Years

🔹Skills: MongoDB, Express, React, Node, Docker, Kubernetes, Kafka


Read more
IT service Based
Gurugram, Bengaluru (Bangalore)
5 - 8 yrs
₹7L - ₹15L / yr
skill iconJava
Multithreading
skill iconSpring Boot
Microservices
Apache Kafka
+1 more

Looking for a Java Developer for Gurugram and Bangalore locations with 5+ years of experience in Java, Microservices, Multithreading, Spring Boot, Kafka, and any MQ series.

Read more
AI Powered Logistics Company

AI Powered Logistics Company

Agency job
via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹32L / yr
skill iconNodeJS (Node.js)
NOSQL Databases
SQL
skill iconMongoDB
RabbitMQ
+19 more

Job Title: Backend Engineer - NodeJS, NestJS, and Python

Location: Hybrid, 2-3 days WFO weekly (Bengaluru, India)


About the role:

We are looking for a skilled and passionate Senior Backend Developer to join our dynamic team. The ideal candidate should have strong experience in Node.js and NestJS, along with a solid understanding of database management, query optimization, and microservices architecture. As a backend developer, you will be responsible for developing and maintaining scalable backend systems, building robust APIs, integrating databases, and working closely with frontend and DevOps teams to deliver high-quality software solutions.


What You'll Do 🛠️ 

  • Design, develop, and maintain server-side logic using Node.js, NestJS, and Python.
  • Develop and integrate RESTful APIs and microservices to support scalable systems.
  • Work with NoSQL and SQL databases (e.g., MongoDB, PostgreSQL, MySQL) to create and manage schemas, write complex queries, and optimize performance.
  • Collaborate with cross-functional teams including frontend, DevOps, and QA.
  • Ensure code quality, maintainability, and scalability through code reviews, testing, and documentation.
  • Monitor and troubleshoot production systems, ensuring high availability and performance.
  • Implement security and data protection best practices.


What You'll Bring 💼 

  • 4 to 6 years of professional experience as a backend developer.
  • Strong proficiency in Node.js and NestJS framework.
  • Good hands-on experience with Python (Django/Flask experience is a plus).
  • Solid understanding of relational and non-relational databases.
  • Proficient in writing complex SQL and NoSQL queries.
  • Experience with microservices architecture and distributed systems.
  • Familiarity with version control systems like Git.
  • Basic understanding of containerization (e.g., Docker) and cloud services is a plus.
  • Excellent problem-solving skills and a collaborative mindset.

 

Bonus Points ➕ 

  • Experience with CI/CD pipelines.
  • Exposure to cloud platforms like AWS, GCP or Azure.
  • Familiarity with event-driven architecture or message brokers (MQTT, Kafka, RabbitMQ)


Why this role matters

You will help build the company from the ground up—shaping our culture and having an impact from Day 1 as part of the foundational team.

Read more
AI Powered Logistics Company

AI Powered Logistics Company

Agency job
via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore)
2 - 4 yrs
₹12L - ₹15L / yr
skill iconNodeJS (Node.js)
skill iconFlask
RESTful APIs
skill iconMongoDB
SQL
+13 more

Job Title: Backend Engineer - NodeJS, NestJS, and Python

Location: Hybrid, 2-3 days WFO weekly (Bengaluru, India)


About the role:

We are looking for a skilled and passionate Senior Backend Developer to join our dynamic team. The ideal candidate should have strong experience in Node.js and NestJS, along with a solid understanding of database management, query optimization, and microservices architecture. As a backend developer, you will be responsible for developing and maintaining scalable backend systems, building robust APIs, integrating databases, and working closely with frontend and DevOps teams to deliver high-quality software solutions.


What You'll Do 🛠️ 

  • Design, develop, and maintain server-side logic using Node.js, NestJS, and Python.
  • Develop and integrate RESTful APIs and microservices to support scalable systems.
  • Work with NoSQL and SQL databases (e.g., MongoDB, PostgreSQL, MySQL) to create and manage schemas, write complex queries, and optimize performance.
  • Collaborate with cross-functional teams including frontend, DevOps, and QA.
  • Ensure code quality, maintainability, and scalability through code reviews, testing, and documentation.
  • Monitor and troubleshoot production systems, ensuring high availability and performance.
  • Implement security and data protection best practices.
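
The query-optimization work described above often comes down to making sure lookups hit an index instead of scanning a table. A minimal sketch of that idea, using Python's stdlib sqlite3 module (the `orders` table and its columns are invented purely for illustration):

```python
import sqlite3

# In-memory database with a hypothetical "orders" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan_for(query, params=()):
    """Return SQLite's query-plan description for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + query, params).fetchall()
    return " ".join(str(r[-1]) for r in rows)  # last column is the plan detail

# Without an index, the lookup scans the whole table.
before = plan_for("SELECT * FROM orders WHERE customer_id = ?", (42,))

# Adding an index turns the full scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan_for("SELECT * FROM orders WHERE customer_id = ?", (42,))

print(before)  # a SCAN over the table
print(after)   # a SEARCH ... USING INDEX idx_orders_customer
```

The same discipline of checking the plan before and after an index applies to PostgreSQL and MySQL (via `EXPLAIN`), just with richer output.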


What You'll Bring 💼 

  • 4 to 6 years of professional experience as a backend developer.
  • Strong proficiency in Node.js and NestJS framework.
  • Good hands-on experience with Python (Django/Flask experience is a plus).
  • Solid understanding of relational and non-relational databases.
  • Proficient in writing complex SQL and NoSQL queries.
  • Experience with microservices architecture and distributed systems.
  • Familiarity with version control systems like Git.
  • Basic understanding of containerization (e.g., Docker) and cloud services is a plus.
  • Excellent problem-solving skills and a collaborative mindset.

 

Bonus Points ➕ 

  • Experience with CI/CD pipelines.
  • Exposure to cloud platforms like AWS, GCP or Azure.
  • Familiarity with event-driven architecture or message brokers (MQTT, Kafka, RabbitMQ)


Why this role matters

You will help build the company from the ground up—shaping our culture and having an impact from Day 1 as part of the foundational team.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Yamuna Wissen
Posted by Yamuna Wissen
Mumbai
4 - 9 yrs
Best in industry
skill iconJava
skill iconSpring Boot
Microservices
Apache Kafka

  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems; good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Ability to express design ideas and thoughts.

Read more
Masters India Private Limited

at Masters India Private Limited

3 candid answers
2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Noida
7yrs+
Up to ₹45L / yr (varies)
skill iconPython
skill iconDjango
FastAPI
skill iconPostgreSQL
skill iconMongoDB
+9 more

We are looking for a customer-obsessed, analytical Sr. Staff Engineer to lead the development and growth of our Tax Compliance product suite. In this role, you’ll shape innovative digital solutions that simplify and automate tax filing, reconciliation, and compliance workflows for businesses of all sizes. You will join a fast-growing company where you’ll work in a dynamic and competitive market, impacting how businesses meet their statutory obligations with speed, accuracy, and confidence.


As the Sr. Staff Engineer, you’ll work closely with product, DevOps, and data teams to architect reliable systems, drive engineering excellence, and ensure high availability across our platform. We’re looking for a technical leader who’s not just an expert in building scalable systems, but also passionate about mentoring engineers and shaping the future of fintech.


Responsibilities

  • Lead, mentor, and inspire a high-performing engineering team (or operate as a hands-on technical lead).
  • Drive the design and development of scalable backend services using Python.
  • Build and maintain services using Django, FastAPI, and task orchestration systems.
  • Own and evolve our CI/CD pipelines with Jenkins, ensuring fast, safe, and reliable deployments.
  • Architect and manage infrastructure using AWS and Terraform with a DevOps-first mindset.
  • Collaborate cross-functionally with product managers, designers, and compliance experts to deliver features that make tax compliance seamless for our users.
  • Set and enforce engineering best practices, code quality standards, and operational excellence.
  • Stay up to date with industry trends and advocate for continuous improvement in engineering processes.

Nice to Have

  • Experience in fintech, tax, or compliance industries.
  • Familiarity with containerization tools like Docker and orchestration with Kubernetes.
  • Background in security, observability, or compliance automation.

Requirements

  • 7+ years of software engineering experience, with at least 2+ years in a leadership or principal-level role.
  • Deep expertise in Python, including API development, performance optimization, and testing.
  • Experience in Event-driven architecture, Kafka/RabbitMQ-like systems.
  • Strong experience with AWS services (e.g., ECS, Lambda, S3, RDS, CloudWatch).
  • Solid understanding of Terraform for infrastructure as code.
  • Proficiency with Jenkins or similar CI/CD tooling.
  • Comfortable balancing technical leadership with hands-on coding and problem-solving.
  • Strong communication skills and a collaborative mindset.
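
The event-driven architecture mentioned in the requirements can be sketched independently of any particular broker. Below is a minimal in-process publish/subscribe bus in Python (all names are illustrative; a production system would put Kafka or RabbitMQ between publishers and subscribers, but the routing idea is the same):

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process publish/subscribe bus."""

    def __init__(self) -> None:
        self._handlers: dict = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._handlers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver to every handler registered for the topic; events on
        # topics with no subscriber are silently dropped.
        for handler in self._handlers[topic]:
            handler(event)

# Usage: a downstream service reacts to invoice events without the
# publisher knowing who is listening.
bus = EventBus()
received = []
bus.subscribe("invoice.created", received.append)
bus.publish("invoice.created", {"invoice_id": 1, "amount": 4500})
bus.publish("payment.settled", {"invoice_id": 1})  # no subscriber: ignored

print(received)  # [{'invoice_id': 1, 'amount': 4500}]
```

The decoupling shown here is what makes event-driven systems easy to extend: new consumers subscribe without any change to the producer.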
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Rashmi SR
Posted by Rashmi SR
Bengaluru (Bangalore)
4 - 9 yrs
Best in industry
skill iconJava
skill iconSpring Boot
Apache Kafka
Data Structures
Algorithms
+2 more

Java Developer – Job Description

Wissen Technology is now hiring for a Java Developer (Bangalore) with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • Experience: 4 to 7 years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems; good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is part of the Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals.

Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from institutions such as Wharton, MIT, the IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹10L - ₹25L / yr
Microsoft Windows Azure
Data engineering
skill iconPython
Apache Kafka

Role Overview:

We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.

The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.

Key Responsibilities:

  • Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
  • Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
  • Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
  • Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
  • Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
  • Mentor junior engineers, perform code reviews, and promote engineering best practices.
  • Stay current with evolving technologies in cloud, big data, and healthcare data standards.
  • Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).
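
A common building block behind the real-time streaming work described above is windowed aggregation. The sketch below (plain Python with an invented `ts` field; a production version would run inside Kafka Streams or Spark Structured Streaming on Databricks) groups events into fixed, tumbling time windows:

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed-size (tumbling) window.

    Each event is a dict with an epoch-seconds 'ts' field; the window
    key is the start time of the window the event falls into.
    """
    counts = Counter()
    for event in events:
        # Round the timestamp down to the start of its window.
        window_start = event["ts"] - event["ts"] % window_seconds
        counts[window_start] += 1
    return dict(counts)

# Four events spread across three one-minute windows.
events = [{"ts": 10}, {"ts": 65}, {"ts": 70}, {"ts": 130}]
print(tumbling_window_counts(events, 60))  # {0: 1, 60: 2, 120: 1}
```

Real streaming engines add the hard parts this sketch omits: late-arriving data, watermarks, and incremental state stores.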

Required Skills & Qualifications:

  • 4+ years of hands-on experience in data engineering roles.
  • Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
  • Proficient in Python for data processing and automation.
  • Experience with Azure Databricks (or readiness to ramp up quickly).
  • Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
  • Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
  • Familiarity with containerization tools like Docker and orchestration using Kubernetes.
  • Exposure to CI/CD pipelines for data applications.
  • Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
  • Excellent problem-solving abilities and a proactive mindset.
  • Strong communication and interpersonal skills to work in cross-functional teams.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Robin Silverster
Posted by Robin Silverster
Bengaluru (Bangalore)
5 - 8 yrs
Best in industry
skill iconJava
skill iconSpring Boot
Apache Kafka
Microservices
Multithreading
+2 more

Java Developer – Job Description

Wissen Technology is now hiring for a Java Developer (Bangalore) with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • Experience: 4 to 7 years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems; good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is part of the Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals.

Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from institutions such as Wharton, MIT, the IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Bengaluru (Bangalore)
3 - 8 yrs
₹17L - ₹25L / yr
skill iconMongoDB
skill iconPython
skill iconFlask
skill iconDjango
Windows Azure
+4 more

Employment type: Contract basis


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks.
  • Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
  • Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
  • Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
  • Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
  • Maintain documentation and implement best practices for data architecture, governance, and security.

⚙️ Required Skills

  • Programming: Proficient in PySpark, Python, and SQL, MongoDB
  • Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
  • Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
  • Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
  • Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
  • CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
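
The star-schema modeling called out above can be illustrated with a toy fact/dimension pair. The sketch uses Python's stdlib sqlite3; the table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes.
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
# Fact table: measures plus foreign keys into the dimensions.
conn.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")

conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "books"), (2, "games")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

# A typical star-schema query: join the fact table to a dimension and
# aggregate a measure by a descriptive attribute.
revenue = conn.execute(
    """
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
    """
).fetchall()

print(revenue)  # [('books', 15.0), ('games', 7.5)]
```

A snowflake schema simply normalizes the dimensions further (e.g. category into its own table), trading join cost for less redundancy.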

🧰 Preferred Qualifications

  • Bachelor's or Master's in Computer Science, Engineering, or related field.
  • Certifications in Azure/AWS are highly desirable.
  • Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.
Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Bengaluru (Bangalore), Mumbai, Nagpur, Ahmedabad, Kochi (Cochin), Chennai
6 - 11 yrs
₹4L - ₹15L / yr
skill iconJava
skill iconSpring Boot
Microservices
Apache Kafka
Spring

Job Description:

We are looking for a Senior Java Developer with strong expertise in Apache Kafka and backend systems. The ideal candidate will have hands-on experience in Java (8/11+), Spring Boot, and building scalable, real-time data pipelines using Kafka.

Key Responsibilities:

  • Develop and maintain backend services using Java and Spring Boot
  • Design and implement Kafka-based messaging and streaming solutions
  • Optimize Kafka performance (topics, partitions, consumers)
  • Collaborate with cross-functional teams to deliver scalable microservices
  • Ensure code quality and maintain best practices in a distributed environment

Required Skills:

  • 6+ years in Java development
  • 3+ years of hands-on Kafka experience (producers, consumers, streams)
  • Strong knowledge of Spring Boot, REST APIs, and microservices
  • Familiarity with Kafka Connect, Schema Registry, and stream processing
  • Experience with containerization (Docker), CI/CD, and cloud platforms (AWS/GCP/Azure)
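
One detail behind the "topics, partitions, consumers" tuning mentioned above is how keyed records map to partitions. The sketch below is a simplified stand-in: Kafka's Java client hashes keys with murmur2, while CRC32 is used here purely as a stable hash for illustration. The point it demonstrates is why all records with the same key land on the same partition:

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Simplified keyed partitioner: stable hash of the key, modulo the
    partition count. Kafka's Java client uses murmur2 rather than CRC32,
    but the mapping principle is the same."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

partitions = 6
p1 = partition_for("order-42", partitions)
p2 = partition_for("order-42", partitions)

# The same key always maps to the same partition, which is what
# preserves per-key ordering within a topic.
print(p1 == p2)              # True
print(0 <= p1 < partitions)  # True

# Note: changing num_partitions remaps keys, which is why growing a
# topic's partition count breaks existing key ordering guarantees.
```

This is also why hot keys create skewed partitions: the hash is deterministic, so one heavy key cannot be spread across consumers.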


Read more
KJBN labs

at KJBN labs

2 candid answers
sakthi ganesh
Posted by sakthi ganesh
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹30L / yr
Hadoop
Apache Kafka
Spark
skill iconPython
skill iconJava
+8 more

Senior Data Engineer Job Description

Overview

The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.

Key Responsibilities

  • Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
  • Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
  • Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
  • Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
  • Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
  • Data Governance: Implement data governance policies, ensuring compliance with data security and privacy regulations (e.g., GDPR, CCPA) and internal standards.
  • Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
  • Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
  • Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.

Required Qualifications

  • Education: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
  • Technical Skills:
    • Proficiency in programming languages such as Python, Java, or Scala.
    • Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
    • Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
    • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
    • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
    • Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
  • Soft Skills:
    • Excellent problem-solving and analytical skills.
    • Strong communication and collaboration abilities.
    • Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
  • Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.

Preferred Qualifications

  • Experience with real-time data processing and streaming architectures.
  • Familiarity with machine learning pipelines and MLOps practices.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
  • Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.

Work Environment

  • Location: Hybrid/Remote/On-site (depending on company policy).
  • Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
  • Hours: Full-time, with occasional on-call responsibilities for critical data systems.

Read more
strektech
Venitha N
Posted by Venitha N
Chennai
5 - 10 yrs
₹15L - ₹30L / yr
skill iconJava
Apache Kafka
Messaging
Multithreading
Microservices

Hiring for Java Developer


Experience : 5 to 10 yrs

Notice Period : 0 to 15 days

Location : Pune

Work Mode : WFO (5 days)


As Java developer you would be expected to perform many duties throughout the development lifecycle of applications, from concept and design right through to testing. Here are some of the responsibilities you may have:


Develop high-level design and define software architecture

Implement and maintain quality systems within the group

Proficiently estimate and design approaches, nimbly moving to alternate approaches if needed; develop and execute unit test strategies

Monitor and track tasks, and report status

Assist project heads to conceptualize, design, develop, test and implement technology solutions

Effectively collaborate with stakeholders and users to ensure customer satisfaction


Skill Set :

Java 7/Java 8 with microservices, multithreading, Spring Boot, JUnit, Kafka; Splunk (good to have), OpenShift (good to have), authentication/Spring Security (good to have)


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Hyderabad, Bengaluru (Bangalore), Mumbai, Pune, Gurugram, Chennai
4 - 10 yrs
₹5L - ₹20L / yr
Red Hat PAM Developer
Red Hat Process Automation Manager (PAM)
jBPM
JBPM (Java Business Process Management)
BPMN 2.0 (Business Process Model and Notation)
+6 more

Job Title : Red Hat PAM Developer

Experience Required :

  • Relevant Experience : 4+ Years
  • Total Experience : Up to 8 Years

No. of Positions : 4

Work Locations : Hyderabad / Bangalore / Mumbai / Pune / Gurgaon / Chennai

Work Mode : Hybrid

Work Timings : 1:00 PM to 10:00 PM IST

Interview Mode : Virtual

Interview Rounds : 2


Mandatory Skills :

  • Excellent communication skills – must be comfortable in client-facing roles
  • Red Hat Process Automation Manager (PAM)
  • JBPM (Java Business Process Management)
  • BPMN 2.0 (Business Process Model and Notation) – low-code platform
  • DMN (Decision Model and Notation) – business processes and rules
  • Spring Boot
  • JavaScript

Good-to-Have Skills :

  • Red Hat Fuse
  • Apache Kafka
  • Apache Camel
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Robin Silverster
Posted by Robin Silverster
Bengaluru (Bangalore)
11 - 18 yrs
Best in industry
skill iconJava
skill iconSpring Boot
Microservices
Multithreading
Data Structures
+3 more

Java Developer – Job Description


Wissen Technology is now hiring for a Java Developer (Bangalore) with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • Experience: 5 to 12 years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems; good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is part of the Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals.

Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from institutions such as Wharton, MIT, the IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Read more
Data Havn

Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 9 yrs
₹40L - ₹60L / yr
skill iconPython
SQL
Data engineering
Snowflake
ETL
+5 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Able to handle team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
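
The extract-transform-load flow described above can be sketched end to end in a few lines. The example below uses Python with stdlib sqlite3 as a stand-in warehouse; the field names and cleaning rules are invented for illustration:

```python
import sqlite3

# Extract: raw records as they might arrive from an API or file.
raw_records = [
    {"id": "1", "city": " delhi ", "revenue": "1200.50"},
    {"id": "2", "city": "Mumbai", "revenue": "980"},
    {"id": "3", "city": "delhi", "revenue": "n/a"},  # bad row
]

# Transform: cast types, normalize text, drop rows that fail validation.
def transform(record):
    try:
        return (int(record["id"]),
                record["city"].strip().title(),
                float(record["revenue"]))
    except ValueError:
        return None  # reject rows with unparseable fields

clean_rows = [row for row in (transform(r) for r in raw_records) if row]

# Load: write the cleaned rows into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, city TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean_rows)

loaded = conn.execute("SELECT city, revenue FROM sales ORDER BY id").fetchall()
print(loaded)  # [('Delhi', 1200.5), ('Mumbai', 980.0)]
```

Production pipelines wrap exactly these three stages in orchestration (Airflow, Glue, Data Factory) and add logging of rejected rows rather than silently dropping them.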

Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
skill iconPython
skill iconDjango
skill iconPostgreSQL
MySQL
SQL
+17 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon

Working Days : 6 Days (Monday to Saturday)


Job Summary :

We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.

The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.


Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.


Key Responsibilities :

  • Write efficient, reusable, testable, and scalable code using the Django framework.
  • Develop backend components, server-side logic, and statistical models.
  • Design and implement high-availability, low-latency applications with robust data protection and security.
  • Contribute to the development of highly responsive web applications.
  • Collaborate with cross-functional teams on system design and integration.

Mandatory Skills :

  • Strong programming skills in Python and Django (or similar frameworks like Flask).
  • Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
  • Strong understanding of SQL and NoSQL ORM.
  • Solid grasp of data structures, multithreading, and operating system concepts.
  • Experience with RESTful API development and implementation of API security.
  • Knowledge of JSON/XML and their use in data exchange.

Good-to-Have Skills :

  • Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
  • Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
  • Familiarity with MongoDB and other NoSQL databases
  • Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
  • Knowledge in building and integrating statistical learning models.
Read more
iris software

at iris software

2 candid answers
Parveen Kaur
Posted by Parveen Kaur
Pune, Noida
5 - 8 yrs
₹20L - ₹25L / yr
skill iconJava
Apache Kafka
Multithreading
Microservices
+1 more

Job Role: We are seeking a skilled Java Developer to contribute to the development and enhancement of a renowned banking application, which supports automatic reconciliation and unified data reporting for its clients. This role involves working on high-impact enhancements, data pipeline integration, and platform modernization. The ideal candidate will be a quick learner, self-motivated, and able to ramp up quickly in a fast-paced environment.


Key Responsibilities:

 Design, develop, and maintain Java-based applications using Java 17 and Spring Boot.

 Implement and manage message routing using Apache Camel.

 Develop and monitor data pipelines using Kafka.

 Support and enhance existing cloud-native applications.

 Work with OpenShift Container Platform (OCP 4) for container orchestration and deployments.

 Utilize Jenkins for CI/CD pipeline automation and management.

 Collaborate with cross-functional teams to integrate multiple data sources into a unified reporting platform.

 Participate in code reviews, unit testing, and performance tuning.

 Troubleshoot and resolve production issues in collaboration with operations teams.

 Document development processes and system configurations.
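The Kafka pipelines in these responsibilities depend on one core property: records with the same key always land on the same partition, which is what preserves per-key ordering in a reconciliation flow. As an illustrative sketch only (Kafka's real default partitioner uses murmur2 hashing; a stable stdlib checksum stands in for it here, and the account keys are hypothetical):

```python
# Sketch of key-based partition routing: equal keys -> equal partition,
# so all events for one account are processed in order.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically."""
    return zlib.crc32(key) % num_partitions

# All events for acct-42 stay on one partition, preserving their order.
events = [(b"acct-42", "debit"), (b"acct-7", "credit"), (b"acct-42", "credit")]
routed = {}
for key, payload in events:
    routed.setdefault(partition_for(key, 6), []).append(payload)
```

Because routing is a pure function of the key, a consumer of any single partition sees every event for its accounts in publish order, with no cross-partition coordination.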



Required Skills:

 Strong proficiency in Java 17 and Spring Boot frameworks.

 Hands-on experience with Apache Camel for message routing and transformation.

 Solid experience in Kafka development and monitoring tools.

 Good understanding of cloud pipeline architectures and deployment strategies.

 Experience working with OpenShift (OCP 4).

 Familiarity with Jenkins for CI/CD and automated deployments.

 Understanding of cloud deployment platforms (AWS, Azure, or GCP preferred).

 Strong analytical and debugging skills.

 Ability to learn quickly and adapt to evolving project requirements.



Nice to Have:

 Experience in financial services or transaction reporting platforms.

 Familiarity with microservices architecture and containerization best practices.

 Knowledge of monitoring tools (e.g., Prometheus, Grafana).

Hyderabad
5 - 8 yrs
₹24L - ₹30L / yr
Apache Kafka
skill iconElastic Search
skill iconNodeJS (Node.js)
ETL
skill iconPython
+2 more

Company Overview

We are a dynamic startup dedicated to empowering small businesses through innovative technology solutions. Our mission is to level the playing field for small businesses by providing them with powerful tools to compete effectively in the digital marketplace. Join us as we revolutionize the way small businesses operate online, bringing innovation and growth to local communities.


Job Description

We are seeking a skilled and experienced Data Engineer to join our team. In this role, you will develop systems on cloud platforms capable of processing millions of interactions daily, leveraging the latest cloud computing and machine learning technologies while creating custom in-house data solutions. The ideal candidate should have hands-on experience with SQL, PL/SQL, and any standard ETL tools. You must be able to thrive in a fast-paced environment and possess a strong passion for coding and problem-solving.


Required Skills and Experience

  • Minimum 5 years of experience in software development.
  • 3+ years of experience in data management and SQL expertise – PL/SQL, Teradata, and Snowflake experience strongly preferred.
  • Expertise in big data technologies such as Hadoop, HiveQL, and Spark (Scala/Python).
  • Expertise in cloud technologies – AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR).
  • Experience with queuing systems (e.g., SQS, Kafka) and caching systems (e.g., Ehcache, Memcached).
  • Experience with container management tools (e.g., Docker Swarm, Kubernetes).
  • Familiarity with data stores, including at least one of the following: Postgres, MongoDB, Cassandra, or Redis.
  • Ability to create advanced visualizations and dashboards to communicate complex findings (e.g., Looker Studio, Power BI, Tableau).
  • Strong skills in manipulating and transforming complex datasets for in-depth analysis.
  • Technical proficiency in writing code in Python and advanced SQL queries.
  • Knowledge of AI/ML infrastructure, best practices, and tools is a plus.
  • Experience in analyzing and resolving code issues.
  • Hands-on experience with software architecture concepts such as Separation of Concerns (SoC) and micro frontends with theme packages.
  • Proficiency with the Git version control system.
  • Experience with Agile development methodologies.
  • Strong problem-solving skills and the ability to learn quickly.
  • Exposure to Docker and Kubernetes.
  • Familiarity with AWS or other cloud platforms.
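The ETL experience asked for above usually comes down to a transform step that normalizes types, drops malformed rows, and deduplicates on a business key. A minimal sketch of that step, with hypothetical field names (order_id, amount):

```python
# Sketch of an ETL transform: coerce types, skip bad rows, dedupe on key.
def transform(rows):
    seen, out = set(), []
    for row in rows:
        try:
            rec = {"order_id": str(row["order_id"]).strip(),
                   "amount": round(float(row["amount"]), 2)}
        except (KeyError, TypeError, ValueError):
            continue  # malformed input: skip (or quarantine, in production)
        if rec["order_id"] in seen:
            continue  # keep only the first occurrence of each business key
        seen.add(rec["order_id"])
        out.append(rec)
    return out

raw = [{"order_id": " A1 ", "amount": "19.99"},
       {"order_id": "A1", "amount": "19.99"},   # duplicate key
       {"order_id": "A2", "amount": "oops"}]     # malformed amount
clean = transform(raw)
```

In a real pipeline the same shape appears in PySpark or Glue jobs; only the row source and sink change.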


Responsibilities

  • Develop and maintain our in-house search and reporting platform.
  • Create data solutions that complement core products and improve performance and data quality.
  • Collaborate with the development team to design, develop, and maintain our suite of products.
  • Write clean, efficient, and maintainable code, adhering to coding standards and best practices.
  • Participate in code reviews and testing to ensure high-quality code.
  • Troubleshoot and debug application issues as needed.
  • Stay up-to-date with emerging trends and technologies in the development community.


How to apply?

  • If you are passionate about designing user-centric products and want to be part of a forward-thinking company, we would love to hear from you. Please send your resume, a brief cover letter outlining your experience and your current CTC (Cost to Company) as a part of the application.


Join us in shaping the future of e-commerce!

Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Bengaluru (Bangalore), Mumbai
5 - 10 yrs
Best in industry
skill iconJava
Microservices
skill iconAmazon Web Services (AWS)
Apache Kafka
+1 more

Job Description: We are looking for a talented and motivated Software Engineer with expertise in both Windows and Linux operating systems and solid experience in Java technologies. The ideal candidate should be proficient in data structures and algorithms, as well as frameworks like Spring MVC, Spring Boot, and Hibernate. Hands-on experience working with MySQL databases is also essential for this role.


Responsibilities:

● Design, develop, test, and maintain software applications using Java technologies.

● Implement robust solutions using Spring MVC, Spring Boot, and Hibernate frameworks.

● Develop and optimize database operations with MySQL.

 Analyze and solve complex problems by applying knowledge of data structures and algorithms.

● Work with both Windows and Linux environments to develop and deploy solutions.

● Collaborate with cross-functional teams to deliver high-quality products on time.

● Ensure application security, performance, and scalability.

● Maintain thorough documentation of technical solutions and processes.

● Debug, troubleshoot, and upgrade legacy systems when required.

Requirements:

● Operating Systems: Expertise in Windows and Linux environments.

● Programming Languages & Technologies: Strong knowledge of Java (Core Java, Java 8+).

● Frameworks: Proficiency in Spring MVC, Spring Boot, and Hibernate.

 Algorithms and Data Structures: Good understanding and practical application of DSA concepts.

 Databases: Experience with MySQL – writing queries, stored procedures, and performance tuning.

● Version Control Systems: Experience with tools like Git.

● Deployment: Knowledge of CI/CD pipelines and tools such as Jenkins, Docker (optional)

Wissen Technology

at Wissen Technology

4 recruiters
Deepa Shankar
Posted by Deepa Shankar
Bengaluru (Bangalore)
4 - 9 yrs
Best in industry
skill iconJava
skill iconSpring Boot
Microservices
Apache Kafka
Multitasking
+2 more

Required Skillset

  • Experience in Core Java 1.8 and above, Data Structures, OOPS, Multithreading, Algorithms, Collections, System Design, Unix/Linux.
  • Possess good architectural knowledge and be aware of enterprise application design patterns.
  • Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-volume server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem solving and coding skills in Java.
  • Strong interpersonal, communication and analytical skills.
  • Should be able to express their design ideas and thoughts.


Job Brief-

  • Understand product requirements and come up with solution approaches.
  • Build and enhance large scale domain centric applications.
  • Deploy high quality deliverables into production adhering to the security, compliance and SDLC guidelines.

Wissen Technology

at Wissen Technology

4 recruiters
Robin Silverster
Posted by Robin Silverster
Mumbai
5 - 12 yrs
Best in industry
skill iconJava
skill iconSpring Boot
Apache Kafka
Microservices
Data Structures
+2 more

Java Developer – Job Description

Wissen Technology is now hiring for a Java Developer – Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark in high-end technical consulting.

Required Skills:

  • Exp. – 4 to 7 years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems; should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Should have the ability to express their design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.

Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.

Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as a Top 20 AI/ML vendor by CIO Insider.

Wissen Technology

at Wissen Technology

4 recruiters
VenkataRamanan S
Posted by VenkataRamanan S
Bengaluru (Bangalore)
3 - 8 yrs
Best in industry
Microservices
Apache Kafka

Key Responsibilities:

  • Design, develop, and maintain robust and scalable backend applications using Core Java and Spring Boot.
  • Build and manage microservices-based architectures and ensure smooth inter-service communication.
  • Integrate and manage real-time data streaming using Apache Kafka.
  • Write clean, maintainable, and efficient code following best practices.
  • Collaborate with cross-functional teams including QA, DevOps, and product management.
  • Participate in code reviews and provide constructive feedback.
  • Troubleshoot, debug, and optimize applications for performance and scalability.

Required Skills:

  • Strong knowledge of Core Java (Java 8 or above).
  • Hands-on experience with Spring Boot and the Spring ecosystem (Spring MVC, Spring Data, Spring Security).
  • Experience in designing and developing RESTful APIs.
  • Solid understanding of Microservices architecture and related patterns.
  • Practical experience with Apache Kafka for real-time data processing.
  • Familiarity with SQL/NoSQL databases such as MySQL, PostgreSQL, or MongoDB.
  • Good understanding of CI/CD tools and practices.
  • Knowledge of containerization tools like Docker is a plus.
  • Strong problem-solving skills and attention to detail.
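Kafka's at-least-once delivery means a consumer can see the same record again after a rebalance or retry, so the "real-time data processing" asked for above usually pairs with an idempotent consumer. A minimal sketch of the pattern, with an in-memory set standing in for a durable store (in a real service this would be, e.g., a database unique constraint):

```python
# Sketch of an idempotent consumer: replayed records change state once.
class IdempotentConsumer:
    def __init__(self):
        self.processed_ids = set()  # stand-in for a durable dedupe store
        self.total = 0

    def handle(self, event: dict) -> bool:
        """Apply the event once; return False for a duplicate replay."""
        if event["id"] in self.processed_ids:
            return False
        self.processed_ids.add(event["id"])
        self.total += event["amount"]
        return True

consumer = IdempotentConsumer()
# The same record delivered twice (a replay) is counted only once.
for ev in [{"id": "e1", "amount": 10}, {"id": "e2", "amount": 5},
           {"id": "e1", "amount": 10}]:
    consumer.handle(ev)
```

With this in place, the consumer can safely re-read from its last committed offset after a crash without double-applying events.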


One Impression

at One Impression

1 video
4 recruiters
Achin Sood
Posted by Achin Sood
Gurugram
1 - 3 yrs
Best in industry
Problem solving
Data Structures
MySQL
skill iconMongoDB
DynamoDB
+9 more

We are looking for passionate people who love solving interesting and complex technology challenges and are enthusiastic about building an industry-first, innovative product to solve new-age, real-world problems. This role requires strategic leadership and the ability to manage complex technical challenges and drive innovation while ensuring operational excellence. As a Backend SDE-2, you will collaborate with key stakeholders across business, product management, and operations to ensure alignment with the organization's goals, and play a critical role in shaping the technology roadmap and engineering culture.


Key Responsibilities


  • Strategic Planning: Work closely with senior leadership to develop and implement engineering strategies that support business objectives. Understand broader organization goals and prepare technology roadmaps.
  • Technical Excellence: Guide the team in designing and implementing scalable, extensible and secure software systems. Drive the adoption of best practices in technical architecture, coding standards, and software testing to ensure product delivery with highest speed AND quality.
  • Project and Program Management: Setting up aggressive as well as realistic timelines with all the stakeholders, ensure the successful delivery of engineering projects as per the defined timelines with best quality standards ensuring budget constraints are met. Use agile methodologies to manage the development process and resolve bottlenecks.
  • Cross-functional collaboration: Collaborate with Product Management, Design, Business, and Operations teams to define project requirements and deliverables. Ensure the smooth integration of engineering efforts across the organization.
  • Risk Management: Anticipate and mitigate technical risks and roadblocks. Proactively identify areas of technical debt and drive initiatives to reduce it.


Required Qualifications


  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 1-3 years of experience in software engineering
  • Excellent problem-solving skills, with the ability to diagnose and resolve complex technical challenges.
  • Proven track record of successfully delivering large-scale, high-impact software projects.
  • Strong understanding of software design principles and patterns.
  • Expertise in multiple programming languages and modern development frameworks.
  • Experience with cloud infrastructure (AWS), microservices, and distributed systems.
  • Experience with relational and non-relational databases.
  • Experience with Redis, ElasticSearch.
  • Experience in DevOps, CI/CD pipelines, and infrastructure automation.
  • Strong communication and interpersonal skills, with the ability to influence and inspire teams and stakeholders at all levels.


Skills:- MySQL, Python, Django, AWS, NoSQL, Kafka, Redis, ElasticSearch

Cloudesign Technology Solutions
Anshul Saxena
Posted by Anshul Saxena
Noida, Hyderabad, Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹30L / yr
Apache Kafka
Kafka Cluster
Schema Registry
Streaming
Kafka
+1 more

Job Title: Tech Lead and SSE – Kafka, Python, and Azure Databricks (Healthcare Data Project)

Experience: 4 to 12 years


Role Overview:

We are looking for a highly skilled Tech Lead with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.


Key Responsibilities:

  • Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
  • Architect scalable data streaming and processing solutions to support healthcare data workflows.
  • Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
  • Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
  • Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
  • Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
  • Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
  • Stay updated with the latest cloud technologies, big data frameworks, and industry trends.


Required Skills & Qualifications:

  • 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
  • Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
  • Experience with Azure Databricks (or willingness to learn and adopt it quickly).
  • Hands-on experience with cloud platforms (Azure preferred, AWS or GCP is a plus).
  • Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
  • Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
  • Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
  • Strong analytical skills, problem-solving mindset, and ability to lead complex data projects.
  • Excellent communication and stakeholder management skills.
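Much of the Kafka/Databricks pipeline work described above reduces to windowed aggregation over an event stream, e.g. counting claims per type in each fixed time window. A minimal tumbling-window sketch in plain Python (the timestamps and the claim_type field are hypothetical; in production this would be a Kafka Streams or Spark Structured Streaming job):

```python
# Sketch of a tumbling-window aggregation: count events per (window, type).
from collections import defaultdict

WINDOW_SEC = 60

def window_start(ts: int) -> int:
    """Floor an epoch-seconds timestamp to its tumbling-window start."""
    return ts - (ts % WINDOW_SEC)

def aggregate(events):
    counts = defaultdict(int)
    for ev in events:
        counts[(window_start(ev["ts"]), ev["claim_type"])] += 1
    return dict(counts)

stream = [{"ts": 100, "claim_type": "inpatient"},
          {"ts": 119, "claim_type": "inpatient"},
          {"ts": 121, "claim_type": "outpatient"}]
agg = aggregate(stream)
```

Because windows are non-overlapping and keyed by their start time, each event contributes to exactly one bucket, which keeps the aggregation easy to parallelize per key.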


One Impression

at One Impression

1 video
4 recruiters
Achin Sood
Posted by Achin Sood
Gurugram
0 - 2 yrs
Best in industry
Problem solving
Data Structures
MySQL
DynamoDB
skill iconMongoDB
+9 more

We're seeking passionate, next-gen minded engineers who are excited about solving complex technical challenges and building innovative, first-of-its-kind products which make a tangible difference for our customers. As a Backend SDE-1, you will play a key role in driving strategic initiatives, collaborating with cross-functional teams across business, product, and operations to solve exciting problems. This role demands strong technical acumen, leadership capabilities, and a mindset focused on innovation and operational excellence.

We value individuals who think independently, challenge the status quo, and bring creativity and curiosity to the table—not those who simply follow instructions. If you're passionate about solving problems and making an impact, we'd love to hear from you.


Key Responsibilities


  • Strategic Planning: Work closely with senior leadership to develop and implement engineering strategies that support business objectives. Understand broader organization goals and constantly prioritise your own work.
  • Technical Excellence: Understand the onground problems, explore and design various possible solutions to conclude and implement scalable, extensible and secure software systems. Implement and learn best practices in technical architecture, coding standards, and software testing to ensure product delivery with highest speed AND quality.
  • Project and Program Management: Setting up aggressive as well as realistic timelines with all the stakeholders, ensure the successful delivery of engineering projects as per the defined timelines with best quality standards ensuring budget constraints are met. Use agile methodologies to manage the development process and resolve bottlenecks.
  • Cross-functional collaboration: Collaborate with Product Managers, Design, Business, and Operations teams to define project requirements and deliverables. Ensure the smooth integration of engineering efforts across the organization.
  • Risk Management: Anticipate and mitigate technical risks and roadblocks. Proactively identify areas of technical debt and drive initiatives to reduce it.


Required Qualifications


  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 1+ years of experience in software engineering
  • Excellent problem-solving skills, with the ability to diagnose and resolve complex technical challenges.
  • Strong understanding of software design principles and patterns.
  • Hands-on experience with multiple programming languages and modern development frameworks.
  • Understanding of relational and non-relational databases.
  • Experience with Redis, ElasticSearch.
  • Strong communication and interpersonal skills, with the ability to influence and inspire teams and stakeholders at all levels.


Skills:- MySQL, Python, Django, AWS, NoSQL, Kafka, Redis, ElasticSearch

Zazmic
Remote only
9 - 12 yrs
₹10L - ₹15L / yr
skill iconPython
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconAmazon Web Services (AWS)
CI/CD
+5 more

Title: Senior Software Engineer – Python (Remote: Africa, India, Portugal)


Experience: 9 to 12 Years


INR : 40 LPA - 50 LPA


Location Requirement: Candidates must be based in Africa, India, or Portugal. Applicants outside these regions will not be considered.


Must-Have Qualifications:

  • 8+ years in software development with expertise in Python
  • Strong hands-on experience with Kubernetes
  • Strong understanding of async frameworks (e.g., asyncio)
  • Experience with FastAPI, Flask, or Django for microservices
  • Proficiency with Docker and Kubernetes/AWS ECS
  • Familiarity with AWS, Azure, or GCP and IaC tools (CDK, Terraform)
  • Knowledge of SQL and NoSQL databases (PostgreSQL, Cassandra, DynamoDB)
  • Exposure to GenAI tools and LLM APIs (e.g., LangChain)
  • CI/CD and DevOps best practices
  • Strong communication and mentorship skills


Talent Pro
Bengaluru (Bangalore)
4 - 8 yrs
₹26L - ₹35L / yr
skill iconJava
skill iconSpring Boot
Google Cloud Platform (GCP)
Distributed Systems
Microservices
+3 more

Role & Responsibilities

Responsible for ensuring that the architecture and design of the platform remains top-notch with respect to scalability, availability, reliability and maintainability

Act as a key technical contributor as well as a hands-on contributing member of the team.

Own end-to-end availability and performance of features, driving rapid product innovation while ensuring a reliable service.

Work closely with various stakeholders, such as Program Managers, Product Managers, the Reliability and Continuity Engineering (RCE) team, and the QE team, to estimate and execute features/tasks independently.

Maintain and drive tech backlog execution for non-functional requirements of the platform required to keep the platform resilient

Assist in release planning and prioritization based on technical feasibility and engineering constraints

A zeal to continually find new ways to improve architecture, design and ensure timely delivery and high quality.

Trika Tech
bhagya a
Posted by bhagya a
Bengaluru (Bangalore), Coimbatore
7 - 8 yrs
₹12L - ₹25L / yr
skill iconNodeJS (Node.js)
AWS Lambda
Apache Kafka
skill iconKubernetes

WHO WE ARE


We are a team of digital practitioners with roots stretching back to the earliest days of online commerce, who dedicate themselves to serving our client companies.

We’ve seen the advancements first-hand over the last 25 years and believe our experiences allow us to innovate. Utilizing cutting-edge technology and providing bespoke, innovative services, we believe we can help you stay ahead of the curve.

We take a holistic view of digital strategy. Our approach to transformation is based on conscious Intent to delight customers through continuous Insight and creative Innovation with an enduring culture of problem-solving.

We bring every element together to create innovative, high-performing commerce experiences for enterprise B2C, B2B, D2C and Marketplace brands across industries. From mapping out business and functional requirements, to developing the infrastructure to optimize traditionally fragmented processes, we help you create integrated, future-proofed commerce solutions.

 

WHAT YOU’LL BE DOING

As part of our team, you'll play a key role in building and evolving our Integration Platform as a Service (iPaaS) solution. This platform empowers our clients to seamlessly connect systems, automate workflows, and scale integrations with modern cloud-native tools.

Here’s what your day-to-day will look like:

  • Designing and Building Integrations: Collaborate with clients to understand integration needs and build scalable, event-driven solutions using Apache Kafka, AWS Lambda, API Gateway, and EventBridge.
  • Cloud-Native Development: Develop and deploy microservices and serverless functions using TypeScript (Node.js), hosted on Kubernetes (EKS) and fully integrated with core AWS services like S3, SQS, and SNS.
  • Managing Data Pipelines: Build robust data flows and streaming pipelines using Kafka and NoSQL databases like MongoDB, ensuring high availability and fault tolerance.
  • Client Collaboration: Work directly with customers to gather requirements, design integration patterns, and provide guidance on best practices for cloud-native architectures.
  • Driving Platform Evolution: Contribute to the ongoing improvement of our iPaaS platform, enhancing observability, scaling capabilities, and CI/CD processes using modern DevOps practices.

WHAT WE NEED IN YOU


  • Solid Experience in Apache Kafka for data streaming and event-driven systems
  • Production experience with Kubernetes (EKS) and containerized deployments
  • Deep knowledge of AWS, including S3, EC2, SQS, SNS, EventBridge, Lambda
  • Proficient in TypeScript (Node.js environment)
  • Experience with MongoDB or other NoSQL databases
  • Familiarity with microservices architecture, async messaging, and DevOps practices
  • AWS Certification (e.g., Solutions Architect or Developer Associate) is a plus
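The EventBridge experience listed above centers on pattern-based routing: a rule fires when every field in its pattern matches the event. A much-simplified sketch of the exact-match subset, written in Python for brevity (real EventBridge patterns also support prefix, numeric, and other operators; the rule and field names are hypothetical):

```python
# Simplified sketch of EventBridge-style rule matching: a rule matches
# when, for every field in its pattern, the event's value is in the
# pattern's allowed list. Exact-match subset only.
def matches(pattern: dict, event: dict) -> bool:
    return all(event.get(field) in allowed
               for field, allowed in pattern.items())

def route(rules, event):
    """Return the names of all rules whose pattern matches the event."""
    return [name for name, pattern in rules.items() if matches(pattern, event)]

rules = {
    "order-sync": {"source": ["shop.orders"], "detail-type": ["OrderCreated"]},
    "audit-all": {"source": ["shop.orders", "shop.payments"]},
}
event = {"source": "shop.orders", "detail-type": "OrderCreated"}
targets = route(rules, event)
```

An event can match several rules at once, which is what lets one bus fan the same event out to independent integrations.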


Qualification

  • Graduate - BE / Btech or equivalent.
  • 5 to 8 years of experience
  • Self-motivated quick learner with excellent problem-solving skills.
  • A good team player with strong communication skills.
  • Energy and real passion to work in a startup environment.

Visit our website - https://www.trikatechnologies.com


