11+ Message broker Jobs in Bangalore (Bengaluru) | Message broker Job openings in Bangalore (Bengaluru)
Apply to 11+ Message broker Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Message broker Job opportunities across top companies like Google, Amazon & Adobe.
Location: All Metros
Requirements: Message Broker (MB) design experience; experience in SOA/middleware integration
Software Development Engineer II (App)
About the company:
At WizCommerce, we’re building the AI Operating System for Wholesale Distribution — transforming how manufacturers, wholesalers, and distributors sell, serve, and scale.
With a growing customer base across North America, WizCommerce helps B2B businesses move beyond disconnected systems and manual processes with an integrated, AI-powered platform.
Our platform brings together everything a wholesale business needs to sell smarter and faster. With WizCommerce, businesses can:
- Take orders easily — whether at a trade show, during customer visits, or online.
- Save hours of manual work by letting AI handle repetitive tasks like order entry or creating product content.
- Offer a modern shopping experience through their own branded online store.
- Access real-time insights on what’s selling, which customers to focus on, and where new opportunities lie.
The wholesale industry is at a turning point — outdated systems and offline workflows can no longer keep up. WizCommerce brings the speed, intelligence, and design quality of modern consumer experiences to the B2B world, helping companies operate more efficiently and profitably.
Backed by leading global investors including Peak XV Partners (formerly Sequoia Capital India), Z47 (formerly Matrix Partners), Blume Ventures, and Alpha Wave Global, we’re rapidly scaling and redefining how wholesale and distribution businesses sell and grow.
If you want to be part of a fast-growing team that’s disrupting a $20 trillion global industry, WizCommerce is the place to be.
Read more about us in Economic Times, The Morning Star, YourStory, or on our website!
Founders:
Divyaanshu Makkar (Co-founder, CEO)
Vikas Garg (Co-founder, CCO)
Job Description:
Role & Responsibilities:
- Architect and oversee the development of our mobile application using React Native, ensuring high-quality code, performance, and user experience.
- Collaborate closely with Product Managers, Designers, and other stakeholders to understand requirements and translate them into technical specifications and deliverables.
- Drive the end-to-end software development lifecycle, including planning, estimation, execution, and delivery of mobile app projects.
- Take ownership of technical decisions, code reviews, and ensure best practices are followed in the team.
- Provide technical mentorship to engineers, promoting their professional growth and skill development.
- Stay up-to-date with industry trends, emerging technologies, and best practices in mobile app development to drive continuous improvement and innovation.
Requirements:
- Solid experience in mobile app development using React Native, with a proven track record of successfully delivering high-quality mobile applications.
- 3-4 years of experience in mobile app development with React Native or in web development with React.
- Strong understanding of software development methodologies, architecture, and design patterns.
- Excellent verbal and written communication skills, with the ability to effectively interact with cross-functional teams and stakeholders.
- Strong problem-solving skills and the ability to make strategic and technical decisions.
- Experience with B2B SaaS products or similar industries is a plus.
Benefits:
- Opportunity to work in a fast-paced, growing B2B SaaS company.
- Collaborative and innovative work environment.
- Competitive salary and benefits package.
- Growth and professional development opportunities.
- Flexible working hours to accommodate your schedule.
Compensation: Best in the industry
Role location: Bengaluru/Gurugram
Website Link: https://www.wizcommerce.com/
Key Responsibilities:
- Design, develop, and maintain robust and scalable backend applications using Core Java and Spring Boot.
- Build and manage microservices-based architectures and ensure smooth inter-service communication.
- Integrate and manage real-time data streaming using Apache Kafka.
- Write clean, maintainable, and efficient code following best practices.
- Collaborate with cross-functional teams including QA, DevOps, and product management.
- Participate in code reviews and provide constructive feedback.
- Troubleshoot, debug, and optimize applications for performance and scalability.
Required Skills:
- Strong knowledge of Core Java (Java 8 or above).
- Hands-on experience with Spring Boot and the Spring ecosystem (Spring MVC, Spring Data, Spring Security).
- Experience in designing and developing RESTful APIs.
- Solid understanding of Microservices architecture and related patterns.
- Practical experience with Apache Kafka for real-time data processing.
- Familiarity with SQL/NoSQL databases such as MySQL, PostgreSQL, or MongoDB.
- Good understanding of CI/CD tools and practices.
- Knowledge of containerization tools like Docker is a plus.
- Strong problem-solving skills and attention to detail.
Roles and Responsibilities:
- Experience in Java fundamentals, OOP concepts, collections, and designing Spring Boot microservices.
- Should have worked with Angular, Node.js, React, and REST services.
- Designing, developing, and delivering high-volume, low-latency applications for mission-critical systems.
- Contribute to all phases of the development lifecycle.
- Write well-designed, testable, efficient code.
- Ensure designs follow specifications.
- Prepare and produce releases of software components.
- 5-8 years of experience preferred.
We are looking for a Sr. Software Engineer (API) for a reputed client in Bangalore (permanent role).
Experience: 5-7 years
1. Roles & Responsibilities:
• Design, develop, and unit test applications in accordance with established standards.
• Prepare reports, manuals, and other documentation on the status, operation, and maintenance of software.
• Analyze and resolve technical and application problems.
• Adhere to high-quality development principles while delivering solutions on time and on budget.
• Provide third-level support to business users.
• Ensure compliance with process and quality management standards.
• Understand and implement the SDLC process.
2. Technical Skills:
Primary Skills
• 2+ years of experience in Web API, C#.NET, OOP, and Entity Framework.
• Minimum 1+ years of experience in web application development with HTML, CSS, JavaScript/jQuery, Entity Framework, and LINQ queries.
• Must have good exposure to query writing and DB management, including writing stored procedures/user-defined functions.
Secondary Skills
• Experience in analyzing existing code and debugging.
• Solid understanding of the SDLC process (design, construction, testing, deployment).
• Proven experience of delivering on time and with quality.
• Good unit-testing skills: able to review their own work and identify and fix defects before releasing code.
Desired Skills
• Experience in developing ERP applications or database-intensive data-entry applications.
• Two or more years in a similar role.
• Hands-on experience with configuration management and version maintenance.
• Prior experience of working in the shipping domain.
• Should be able to understand the functional and technical specifications and develop the application as per the specifications provided.
• Full-stack developers with experience building products with Angular and C# APIs from the ground up will be a plus.
3. Abilities:
• Excellent communication skills
• Good conceptualization skills
• Strong visualization ability
• Analytical ability
• Quality-oriented
• Problem-solving
4. Educational Qualification:
BE/B.Tech/MCA/M.Tech
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience coding MapReduce/YARN programs in Java, Scala, and Python for analyzing Big Data.
- Exposure to the Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for analytics: installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark in Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different sources into HDFS using Sqoop, performing transformations with Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables in HDFS to serve as the feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for analytics: installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing most of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
WPF, C#.NET, Windows-based applications.
Experience with TFS file compare via code.
Experience with .tt (T4 text template) files for auto-code generation.
Job Description:
The primary responsibility of a software engineer is to produce high-quality code using C#.NET as part of an agile development team using SCRUM methods. Responsibilities could include tools development for process improvement, or tooling related to product code development.
Responsibilities
• Contribute to the implementation of User Stories
• Participate in the SCRUM process; be agile
• Fulfill the acceptance criteria with the contributed code (quality, unit tests, metrics, etc.)
• Adhere to architecture, design, and quality guidelines
• Maintenance, problem analysis, and bug fixing within the product
• Technical clarifications with other team members
Regional Sales Manager
Company Profile
Ashnik is a leading enterprise open source solutions company in Southeast Asia and India, enabling organizations to adopt open source for their digital transformation goals. Founded in 2009, it offers a full-fledged Open Source Marketplace, Solutions, and Services – Consulting, Managed, Technical, Training. Over 200 leading enterprises so far have leveraged Ashnik’s offerings in the space of Database platforms, DevOps & Microservices, Kubernetes, Cloud, and Analytics.
As a team culture, Ashnik is a family for its team members. Each member brings a different perspective, new ideas, and a diverse background. Yet together we all strive for one goal – to deliver the best solutions to our customers using open-source software. We passionately believe in the power of collaboration. Through an open platform of idea exchange, we create a vibrant environment for growth and excellence.
Responsibilities
- Create strong relationships with key client stakeholders at both senior and mid-management levels
- Create and articulate compelling value propositions around the use of open-source technology
- Understand Ashnik’s capabilities & services and effectively communicate offerings to the customers
- Create, implement and own a sales pipeline to manage customer lead intake, outbound activity, prioritization and metrics for measurement of deal status
- Understand the competitive landscape and market trends
- Establish sales objectives by forecasting and developing annual sales quotas for regions and territories; projecting expected sales volume and profit for existing and new products/technologies
- Desire to own projects and exceed expectations, with ability to find solutions and deliver results within a rapidly changing, entrepreneurial, technology-driven culture
- Ability to identify and solve client issues strategically
- Excellent interpersonal skills, with the ability to communicate effectively with management and cross-functional teams, for both technical and non-technical audiences
- Work with the sales, pre-sales, technical, and operations teams to implement a targeted sales strategy
- Generate and maintain accurate Account and Opportunity plans
- Work with internal teams on behalf of clients to ensure the highest level of customer service
- Interface with technical support internally to resolve issues that directly impact partners
- Establish high levels of quality, accuracy and process consistency for the sales team
- Reporting and analytics
- Ensure reports and other internal intelligence and insights are provided to the sales team
- Keen business sense, with the ability to find creative business-oriented solutions to problems.
Qualification and Skills
- Graduate with a minimum of 8-10 years of experience in selling IT software products to large enterprises in South India
- Strong skills in enterprise sales cycle
- Familiarity with open source software is highly desirable
- Strong communication and presentation skills
Location: Bangalore
Package: up to 20L
- Very strong knowledge of Embedded C, including:
- Data structures
- Function pointers
- Bitwise operations
- Experience with finite state machine (FSM) software design
- Required: experience with AVR microcontrollers, with projects to demonstrate it
- Familiarity with 32-bit ARM microcontrollers (Cortex-M or any other)
- Required: experience with common communication protocols such as SPI, UART, and I2C
- Comfortable reading datasheets and basic hardware schematics
- Comfortable with the Linux development environment, including the use of GCC, GDB, and Makefiles
Soft Skills
- Good communication skills
- Self-driven
- Passion for embedded engineering

IT solutions specialized in Apps Lifecycle management. (MG1)
- Ability to design the overall architecture of UI components of the web application.
- Maintain quality and ensure responsiveness of applications.
- Collaborate with the rest of the engineering team to design and launch new features.
- Maintain code integrity and organization.
- Experience working with graphic designers and converting designs to visual elements using JavaScript, HTML, and CSS.
- Understanding and implementation of security and data protection.
- Good problem-solving skills and willingness to learn new technology
Qualifications:
- A 2+ year track record of relevant work experience and a degree in Computer Science or a related technical discipline are required
- Excellent knowledge of JavaScript
- Proficiency in ReactJS
- OOP concepts and data structures such as Queue, Stack, Tree, Linked List, etc.
- Good knowledge in working with REST APIs
- Knowledge of converting designs to working web pages using HTML and CSS.
- Experience in Architecture and delivery of web applications.
- Knowledge of code versioning tool – Git
Good to have:
- Experience in AngularJS
- Experience in server-side programming



