11+ Message broker Jobs in Bangalore (Bengaluru) | Message broker Job openings in Bangalore (Bengaluru)
Apply to 11+ Message broker Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Message broker Job opportunities across top companies like Google, Amazon & Adobe.
Location: All Metros
Requirements: Message Broker (MB) design experience; experience in SOA/middleware integration
About the Role
As a Senior Angular Developer, you will take ownership of modules, guide developers, and drive high-performance UI engineering.
Key Responsibilities
• Own modules end-to-end
• Mentor mid-level developers
• Conduct code reviews
• Drive performance optimization
• Contribute to architecture decisions
Required Skills
• Angular, TypeScript
• RxJS
• NgRx / State Management
• Tailwind CSS
• REST API Integration
• Auth (JWT/custom)
• Unit Testing & E2E Testing
• Nx Monorepo (plus)
• UI/UX Understanding
• Performance Optimization
• Charting Libraries
Tools You Will Use
• GitLab
• Figma
• Postman
• VS Code
• Browser DevTools
• NPM
• Nx
• CI/CD
• Slack
Cultural Expectations
• Fast-paced execution
• Ownership mindset
• Clean code & engineering discipline
• Zero ego team culture
• Attention to detail
• Comfortable with ambiguity
• High accountability
• Passion for fintech
• Curiosity & constant learning
● Understand the product as a social product, from the top of the funnel through the free and pro features for teenagers
● Understand the target user persona
● Ideating on new program pathways and working with subject matter experts on building an active, project based learning curriculum
● Responsible for the management and organisation of all programs, across the free and paid offering
● Building data-driven structures to understand the effectiveness of learning and engagement initiatives
● Coordinating with the Business, Marketing, Product and Operations teams to drive learning and engagement objectives
● Build a culture of continuous learning and constructive feedback within the team, geared toward a futuristic learning vision
Expectations and Skills
● Bring a founder’s mindset to work every day: high ownership, drive to make a dent, and high ambition
● Super-high creativity and an innovation-driven approach to learning
● Proven expertise in setting up and scaling masterclasses (cohort-based live courses) and/or project-based learning pedagogy
● Proven Experience (5+ years) in managing large specialized teams
● Understanding of a social product and the category it is building, and complete alignment with the mission
● Hands-on leadership: we don’t want managers but amazing leaders who can drive a sports-team-like culture by being the Captain rather than the Coach
● Fast iterations and a super-fast learning temperament as a leader
● Ability to handle multiple tasks, channels, and diverse global team members
● Very strong at setting up processes, dashboards, workflows, trackers, and project management tools to keep everyone sane as we scale!
● Deep user understanding and a user-driven rather than purely business-focused approach
Experience: 8-10 years
Notice Period: 15 days max
Must-haves*
1. Knowledge of database/NoSQL DB hosting fundamentals (RDS multi-AZ, DynamoDB, MongoDB, and the like)
2. Knowledge of the different storage platforms on AWS (EBS, EFS, FSx), including mounting persistent volumes into Docker containers
3. In-depth knowledge of security principles on AWS (WAF, DDoS protection, Security Groups, NACLs, IAM groups, and SSO)
4. Knowledge of CI/CD platforms is required (Jenkins, GitHub Actions, etc.), including migration of AWS CodePipeline pipelines to GitHub Actions
5. Knowledge of a wide variety of AWS services (SNS, SES, SQS, Athena, Kinesis, S3, ECS, EKS, etc.) is required
6. Knowledge of an Infrastructure-as-Code tool is required. We use CloudFormation (Terraform is a plus); ideally, we would like to migrate from CloudFormation to Terraform
7. Setting CloudWatch alarms and SMS/email/Slack alerts
8. Some knowledge of configuring a monitoring tool such as Prometheus, Dynatrace, etc. (we currently use Datadog and CloudWatch)
9. Experience with any CDN provider's configuration (Cloudflare, Fastly, or CloudFront)
10. Experience with either Python or Go as a scripting language
11. Experience with a Git branching strategy
12. Container hosting knowledge on both Windows and Linux
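To give a feel for item 7 above (CloudWatch alarms wired to SMS/email/Slack via SNS), here is a minimal Python sketch. The instance ID, topic ARN, and threshold are illustrative assumptions; in a real setup you would pass the returned dict to boto3's `cloudwatch.put_metric_alarm(**spec)`.

```python
# Sketch: build the parameters for a high-CPU CloudWatch alarm that notifies
# an SNS topic (which can fan out to SMS, email, or Slack). All identifiers
# below are hypothetical examples.

def cpu_alarm_spec(instance_id, topic_arn, threshold=80.0):
    """Return put_metric_alarm kwargs for a high-CPU alarm on one EC2 instance."""
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,              # evaluate 5-minute averages
        "EvaluationPeriods": 2,     # two consecutive breaches before alarming
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [topic_arn],
    }

spec = cpu_alarm_spec("i-0123456789abcdef0",
                      "arn:aws:sns:us-east-1:111122223333:ops-alerts")
print(spec["AlarmName"])  # -> high-cpu-i-0123456789abcdef0
```

Keeping the alarm definition as plain data like this also makes it easy to port between CloudFormation and Terraform, per item 6.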
The list below is *Nice to Have*:
1. Integration experience with code quality tools (SonarQube, Netsparker, etc.) in CI/CD
2. Kubernetes
3. CDNs other than CloudFront (Cloudflare, Fastly, etc.)
4. Collaboration with multiple teams
5. GitOps
- Design and develop product components and validate them for technical design, performance, and production readiness.
- Review product requirements, provide estimates, and write unit tests.
- Maintain, improve, and integrate existing components and applications.
- Work with support and consulting teams to resolve customer issues.
- Interact with teams as required to understand, reproduce, and troubleshoot customer issues.
- Work in an agile, rapid development and prototyping environment where effective communication is paramount.
Qualifications and Skills
- Bachelor’s degree in Computer Science or equivalent with 3-5 years of experience, or a Master’s degree in Computer Science with an equivalent level of experience.
- Highly proactive, result-oriented team player.
- Required skills:
- Minimum 3 years of experience with Angular 7+, Node.js, JavaScript, jQuery, TypeScript, CSS, HTML, and single-page application frameworks
- Experience working with a unit test framework is a must.
- Experience in advanced text processing using regular expressions and data parsing.
- Expertise in debugging UI issues from the browser.
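As a taste of the "advanced text processing using regular expressions and data parsing" mentioned above, here is a small Python sketch that pulls structured fields out of semi-structured log lines with named groups. The log format is a made-up example.

```python
import re

# A regex with named groups for an illustrative access-log format:
# "<ip> [<timestamp>] "<METHOD> <path> ..." <status>"
LOG_RE = re.compile(
    r"(?P<ip>\d{1,3}(?:\.\d{1,3}){3})\s+"
    r"\[(?P<ts>[^\]]+)\]\s+"
    r'"(?P<method>[A-Z]+) (?P<path>\S+)[^"]*"\s+'
    r"(?P<status>\d{3})"
)

def parse_line(line):
    """Return a dict of fields, or None if the line does not match."""
    m = LOG_RE.search(line)
    return m.groupdict() if m else None

rec = parse_line('10.0.0.7 [21/Apr/2023:10:01:22] "GET /api/v1/users HTTP/1.1" 200')
print(rec["path"], rec["status"])  # -> /api/v1/users 200
```

Named groups keep the parsing code readable as the pattern grows, which matters once a format accumulates optional fields.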
- Developing telemetry software to connect Junos devices to the cloud
- Fast prototyping and laying the software foundation for product solutions
- Moving prototype solutions to a production-grade, multi-tenant cloud SaaS solution
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Build analytics tools that utilize the data pipeline to provide significant insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with partners including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics specialists to strive for greater functionality in our data systems.
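The extract-transform-load flow described in the responsibilities above can be sketched with stdlib pieces alone: "extract" from a CSV source, "transform" (normalize and filter), and "load" into a queryable store. The column names and data here are invented for illustration.

```python
import csv
import io
import sqlite3

RAW = """user,channel,spend
alice,Ads,120.5
bob,organic,0
carol,ads,75.0
"""

def etl(raw_csv, conn):
    rows = csv.DictReader(io.StringIO(raw_csv))               # extract
    cleaned = [
        (r["user"], r["channel"].lower(), float(r["spend"]))  # transform: normalize channel
        for r in rows
        if float(r["spend"]) > 0                              # drop zero-spend rows
    ]
    conn.execute("CREATE TABLE acquisition (user TEXT, channel TEXT, spend REAL)")
    conn.executemany("INSERT INTO acquisition VALUES (?, ?, ?)", cleaned)  # load
    return conn

conn = etl(RAW, sqlite3.connect(":memory:"))
total = conn.execute("SELECT SUM(spend) FROM acquisition").fetchone()[0]
print(total)  # -> 195.5
```

In production the "extract" side would be S3/Kafka/APIs rather than an inline string, and the "load" side a warehouse, but the three-stage shape is the same.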
Qualifications and Desired Experience
- Master’s in Computer Science, Electrical Engineering, Statistics, Applied Math, or equivalent fields, with a strong mathematical background
- 5+ years of experience building data pipelines for data science-driven solutions
- Strong hands-on coding skills (preferably in Python), processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, TensorFlow
- Good team player with excellent interpersonal, written, verbal, and presentation skills
- Create and maintain optimal data pipeline architecture.
- Assemble large, sophisticated data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Experience with AWS, S3, Flink, Spark, Kafka, Elastic Search
- Previous work in a start-up environment
- 3+ years of experience building data pipelines for data science-driven solutions
- Master’s in Computer Science, Electrical Engineering, Statistics, Applied Math, or equivalent fields, with a strong mathematical background
- We are looking for a candidate with 9+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, and has experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Strong hands-on coding skills (preferably in Python), processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, TensorFlow
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.
- Strong analytical skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Proven understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and interpersonal skills.
- Experience supporting and working with multi-functional teams in a multidimensional environment.
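The core idea behind the stream-processing systems listed above (Storm, Spark Streaming) is windowed aggregation over an unbounded feed. A pure-Python sketch of a tumbling-window count, with an invented `(timestamp, key)` event shape:

```python
from collections import defaultdict

def tumbling_counts(events, window=60):
    """Count events per key in tumbling windows.

    events: iterable of (timestamp, key) pairs.
    Returns {(window_start, key): count}.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Snap the timestamp down to the start of its window.
        counts[(ts - ts % window, key)] += 1
    return dict(counts)

events = [(3, "click"), (42, "click"), (61, "view"), (95, "click")]
print(tumbling_counts(events))
```

Real stream processors add the hard parts on top of this kernel: out-of-order events, watermarks, and fault-tolerant state, but the windowed-aggregation shape is the same.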
- Expertise in Database Administration on Production Servers with server configuration, performance tuning, and maintenance with outstanding troubleshooting capabilities in Cloud – AWS / Azure.
- Efficient in architecting, configuring, maintaining, monitoring, and troubleshooting SQL Servers.
- Strong understanding of SQL database structures, principles, and practices.
- Strong experience handling Backups, Restores, Corruption, and Disaster Recovery scenarios.
- Experience in installing and configuring SQL Server databases of different versions.
- Experience in performance tuning and optimization; backups, restores, and recovery models; writing and optimizing SQL statements; partitioning, clustering, HA, and DR. Experience with DMV and DDL queries.
- Independently analyze, solve, and correct issues in real-time, providing problem resolution end-to-end.
- Skilled in working with large volumes of data and loading data from multiple sources.
- Knowledge of reporting and query tools and practices.
- Knowledge of indexes, index management, and statistics.
- Skilled in deploying database change scripts
- Scheduling database backups, including full and transaction log backups.
- Automate SQL server tasks (SQL jobs, Windows scripts, etc.)
- Exposure to MySQL will be an added advantage.
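The backup-automation items above have the same shape at any scale: a script that snapshots a live database to a backup target and verifies the copy. As a self-contained Python sketch, SQLite's online backup API stands in for the T-SQL `BACKUP DATABASE` / Agent-job setup a SQL Server DBA would actually script; the table and data are invented.

```python
import sqlite3

# Build a small "live" database to back up.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, total REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
src.commit()

# Online, page-by-page snapshot into the backup target,
# analogous to a scheduled full backup job.
dest = sqlite3.connect(":memory:")
src.backup(dest)

# Verify the restore target, as a restore test would.
rows = dest.execute("SELECT COUNT(*), SUM(total) FROM orders").fetchone()
print(rows)  # -> (2, 29.5)
```

The verification query at the end matters: an untested backup is the classic DR failure mode, so production jobs typically restore and check, not just copy.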
Soft skills:
- Mission-led. Someone who buys into the mission of the company and the problem it is solving. The company is growing fast and has a tonne of things to achieve; they want people who are bought in for the long term and believe in what they’re doing.
- Pragmatic thinker. You have a structured approach to situations; if you do not know the answer, you know how to find it.
- Open to change. You are able to work in a fast-paced environment where priorities can change instantly, and you are able to adjust.
- Technologist. Someone that is passionate about what they do and experienced in server operating systems and application deployment, configuration, and maintenance.
- Communication and presentation skills.
Tiger Analytics is a global AI & analytics consulting firm. With data and technology at the core of our solutions, we are solving some of the toughest problems out there. Our culture is modeled around expertise and mutual respect, with a team-first mindset. Working at Tiger, you’ll be at the heart of this AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.
We are headquartered in Silicon Valley and have delivery centres across the globe. The role below is for our Chennai or Bangalore office, or you can choose to work remotely.
About the Role:
As an Associate Director - Data Science at Tiger Analytics, you will lead the data science aspects of end-to-end client AI & analytics programs. Your role will be a combination of hands-on contribution, technical team management, and client interaction.
• Work closely with internal teams and client stakeholders to design analytical approaches to solve business problems
• Develop and enhance solutions to a broad range of cutting-edge data analytics and machine learning problems across a variety of industries
• Work on various aspects of the ML ecosystem: model building, ML pipelines, logging & versioning, documentation, scaling, deployment, monitoring, maintenance, etc.
• Lead a team of data scientists and engineers to embed AI and analytics into client business decision processes
Desired Skills:
• High level of proficiency in a structured programming language, e.g. Python or R
• Experience designing data science solutions to business problems
• Deep understanding of ML algorithms for common use cases in both structured and unstructured data ecosystems
• Comfortable with large-scale data processing and distributed computing
• Excellent written and verbal communication skills
• 10+ years of experience, of which 8 years is relevant data science experience, including hands-on programming
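To ground the "model building" slice of the ML ecosystem mentioned above, here is a framework-free sketch: a one-variable linear regression fit by gradient descent in plain Python. The data and hyperparameters are invented for illustration.

```python
def fit_line(xs, ys, lr=0.01, epochs=2000):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated from y = 3x + 1, so the fit should recover w ~ 3, b ~ 1.
xs = [0, 1, 2, 3, 4]
ys = [1, 4, 7, 10, 13]
w, b = fit_line(xs, ys)
print(round(w, 2), round(b, 2))  # -> 3.0 1.0
```

The same loop is what a framework optimizer runs at scale; the role's harder work sits around it, in the pipelines, versioning, deployment, and monitoring that keep such models useful in production.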
Designation will be commensurate with expertise/experience. Compensation packages are among the best in the industry.
Experience in .NET Framework, C#, WCF, ASP.NET.
Well-versed with JavaScript, jQuery, AJAX.
Good understanding of Razor, HTML, CSS.
Designation - SDE II / III (3D team)
About Livspace
Livspace is India’s trusted interior design and renovation platform that connects interior designers, homeowners and vendors. For homeowners, Livspace is their one-stop destination for all things interiors. For interior designers and vendors, we’ve streamlined their workflow from design all the way to delivery through powerful and innovative technology.
We’re currently in nine Indian metro areas. We’ve made over 20,000 customers happy by delivering their dream homes to them. With over 3,500 interior designers on board, we’re the largest design community India has seen. We employ over 2000 passionate individuals who continue to grow and be a part of this exciting journey.
If you value autonomy, enjoy challenges, believe in getting things done and can work with minimal supervision, come join us.
Skills required (Must haves):
- Should have worked with JavaScript for a minimum of 2 years, frontend (preferred) or backend
- Should have knowledge of web development
- Should have good logical thinking and be able to code using best practices
- Should have knowledge of AngularJS or any MVC framework
Good to Have (Plus Points):
- Some knowledge of Python/Java
- Some knowledge of databases (MySQL)
- Some knowledge of Three.js and 3D rendering on the web, and of tools like Blender, 3ds Max, etc.
- Some geometry/mathematics knowledge
- Some functional knowledge of CSS styling, though we are not looking for a UI developer
What you will be working on:
- Active development of a web-based 3D design tool used by interior designers to create awesome designs
- Creation of a builder tool that helps create dynamic products (variable dimensions and finishes) at run time
- Development of a rule engine that takes care of ever-growing product and design rules
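The rule engine mentioned above can be sketched in a few lines: express design rules as data (a predicate plus a violation message) evaluated against a product, so new rules are added without touching engine code. The rule names and product fields here are invented for illustration.

```python
# Each rule: (name, predicate over a product dict, violation message).
RULES = [
    ("min_width",  lambda p: p["width_mm"] >= 300,  "module narrower than 300mm"),
    ("max_width",  lambda p: p["width_mm"] <= 1200, "module wider than 1200mm"),
    ("has_finish", lambda p: p.get("finish"),       "no finish selected"),
]

def validate(product, rules=RULES):
    """Return the list of violation messages; an empty list means the design passes."""
    return [msg for name, check, msg in rules if not check(product)]

print(validate({"width_mm": 450, "finish": "matte-oak"}))  # -> []
print(validate({"width_mm": 150}))  # two violations
```

Keeping rules as data is what lets the rule set grow with the product catalog: new constraints become new entries, not new branches in the engine.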



