11+ IDX Jobs in Bangalore (Bengaluru) | IDX Job openings in Bangalore (Bengaluru)

Job Title : Senior Backend Engineer – Java, AI & Automation
Experience : 4+ Years
Location : Any Cognizant location (India)
Work Mode : Hybrid
Interview Rounds :
- Virtual
- Face-to-Face (In-person)
Job Description :
Join our Backend Engineering team to design and maintain services on the Intuit Data Exchange (IDX) platform.
You'll work on scalable backend systems powering millions of daily transactions across Intuit products.
Key Qualifications :
- 4+ years of backend development experience.
- Strong in Java and the Spring framework.
- Experience with microservices, databases, and web applications.
- Proficient in AWS and cloud-based systems.
- Exposure to AI and automation tools (Workato preferred).
- Python development experience.
- Strong communication skills.
- Comfortable with occasional US shift overlap.
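The role centres on resilient backend services powering millions of daily transactions. As an illustrative sketch only (not Intuit/IDX code), here is the retry-with-exponential-backoff pattern such services commonly apply to transient downstream failures; `flaky_lookup` and its failure behaviour are invented for the example:

```python
import time

def retry(max_attempts=3, base_delay=0.01):
    """Retry a flaky call with exponential backoff, a common resilience
    pattern when one backend service calls another."""
    def wrap(fn):
        def inner(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except ConnectionError:
                    if attempt == max_attempts:
                        raise
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return inner
    return wrap

calls = {"n": 0}

@retry(max_attempts=3)
def flaky_lookup():
    # Hypothetical downstream call that fails twice before succeeding.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = flaky_lookup()
print(result)  # "ok" -- succeeded on the third attempt
```

In a Spring stack this idea is typically delegated to a library such as Spring Retry or Resilience4j rather than hand-rolled.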
SenSight Technologies is a company working on innovative solutions at the intersection of the Internet of Things and Big Data Analytics. Our solution, AutoWiz, is a Platform-as-a-Service that enables insightful connected-vehicle experiences. The AutoWiz Platform is a scalable and versatile vehicle data analytics platform that lets companies in the Automotive, Mobility, Motor Insurance, and Logistics domains offer differentiated solutions based on vehicle-generated data.
Based on the AutoWiz Platform, we offer telematics and mobility solutions and apps. AutoWiz connects vehicles to the AutoWiz cloud, where we develop insights that lead to a better ownership experience and better decisions across the lifecycle of vehicles.
See more information at http://www.autowiz.in
As a Platform Engineer, you will work on a high-performance connected-car data analytics and telematics platform and related applications.
Responsibilities and Requirements
- You will be part of a highly skilled cross-functional team that optimises existing systems and designs and develops new products and features in the area of the Internet of Things.
- Design patterns in Java, Core Java 8, Spring Boot framework, Micro Service Architecture
- Experience with RDBMS (preferably MySQL)
- Experience in implementing JMS messaging services
- Good verbal and written communication skills. Excellent team player with the ability to follow through on deadlines.
Essential qualifications
- Master's or Bachelor's degree in Computer Science or a related field from an accredited university, with high marks.
- 2+ years of experience as a Java developer, especially on scalable and real-time computing systems
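The JMS requirement above comes down to the classic producer/consumer messaging pattern. As a rough, hypothetical sketch (Python's `queue.Queue` standing in for a JMS queue; the vehicle IDs and payload fields are invented):

```python
import queue
import threading

# Point-to-point messaging sketch: queue.Queue stands in for a JMS queue.
# A producer thread publishes (invented) vehicle events; a consumer drains them.
events = queue.Queue()
received = []

def producer():
    for vid in ("KA01-1234", "KA05-5678"):
        events.put({"vehicle": vid, "speed_kmph": 62})
    events.put(None)  # poison pill: tells the consumer to stop

def consumer():
    while True:
        msg = events.get()
        if msg is None:
            break
        received.append(msg)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(received))  # 2
```

A real JMS deployment adds durability, acknowledgements, and broker failover on top of this same shape.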

Job Description
Title - Lead Snowflake Developer
Location - Chennai/Hyderabad/Bangalore
Role - Fulltime
Notice Period/Availability - Immediate
Years of Experience - 6+
Job Description:
- Overall 6+ years of experience in IT/software development
- Minimum 3 years of experience working with Snowflake.
- Designing, implementing and testing cloud computing solutions using Snowflake technology.
- Creating, monitoring and optimization of ETL/ELT processes.
- Migrating solutions from on-premises to public cloud platforms.
- Experience in SQL language and data warehousing concepts.
- Experience in Cloud technologies: AWS, Azure or GCP.
Required Skills:
- 2+ years of development experience in Java.
- Strong Java basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
- Proficient in REST API development
- Message queues (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving
Good to Have Skills:
- 4+ years of experience in using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
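The "any caching mechanism" requirement above is often probed in interviews via the LRU eviction policy. A minimal sketch of the mechanism (illustrative only; real services usually reach for Redis, Caffeine, or similar):

```python
from collections import OrderedDict

class LruCache:
    """Minimal LRU cache: at capacity, evicts the least-recently-used key."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict the LRU entry

cache = LruCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" becomes most recently used
cache.put("c", 3)      # capacity exceeded: "b" is evicted
print(cache.get("b"))  # None
```

The ordered map gives O(1) get/put; the same structure underlies Java's `LinkedHashMap`-based LRU caches.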
👋🏼We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!
REQUIREMENTS:
- Must-have skills: Node.js, microservices architecture (MSA), JavaScript/TypeScript, AWS S3/Lambda/EC2/DynamoDB
- Cloud developer with a minimum of 6 years of experience and technical skills in most of the following areas:
- Expertise in developing REST services and APIs using JavaScript/TypeScript on Node.js is a must
- Experience implementing serverless software using AWS Lambda in Node.js is desirable
- Experience with AWS services including IoT, S3, RDS, IAM, Cognito, SNS, VPC, EBS, Route 53, and CloudWatch is a must
- Proficient in database concepts including schema design, querying, performance tuning, and debugging (both SQL and NoSQL databases)
- Strong experience in designing and developing enterprise cloud-based IoT solutions using AWS
- Experience developing web, API, IoT, and cloud-based solutions is a must
- Ability to write automated unit tests using mocking frameworks (Jasmine, Mocha, Chai, Jest, etc.)
- Experience with modern data architectures (e.g., microservices, event-driven architectures, stream processing) and integrating real-time analytics into customer applications is desirable
- A good eye for NFRs (scalability, extensibility, reliability, etc.) while evaluating designs; able to convert solution requirements into logical systems and subsystems
- Innovates and creates new ideas; able to develop quick proofs of concept and validate them with the customer
RESPONSIBILITIES:
- Understanding functional requirements thoroughly and analysing the client’s needs in the context of the project
- Envisioning the overall solution for defined functional and non-functional requirements, and being able to define technologies, patterns, and frameworks to realize it
- Determining and implementing design methodologies and tool sets
- Enabling application development by coordinating requirements, schedules, and activities.
- Being able to lead/support UAT and production rollouts
- Creating, understanding, and validating WBS and estimated effort for given module/task, and being able to justify it
- Addressing issues promptly, responding positively to setbacks and challenges with a mindset of continuous improvement
- Giving constructive feedback to the team members and setting clear expectations.
- Helping the team troubleshoot and resolve complex bugs
- Carrying out POCs to make sure that suggested design/technologies meet the requirements.

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both the Spark-like backend infrastructure and a library-based programming model. Build horizontally and vertically scalable systems that analyse trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best software development tools and processes.
What You’ll Do
This is an individual contributor position. Expectations will be on the below lines:
- Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What you need to succeed
- 9+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of fundamentals of Computer Science like algorithms and data structures. Hands-on with functional programming and know-how of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
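The sub-second time-series analysis this role describes is usually built from windowed aggregations over event streams. A toy sketch of the idea (illustrative only, not Conviva's implementation; the sample values are invented):

```python
from collections import deque

class SlidingWindowAvg:
    """Average of timestamped samples over a sliding time window --
    the basic building block of real-time stream metrics."""

    def __init__(self, window_secs):
        self.window = window_secs
        self.events = deque()  # (timestamp, value), oldest first
        self.total = 0.0

    def add(self, ts, value):
        self.events.append((ts, value))
        self.total += value
        # Expire samples that fell out of the window.
        while self.events and self.events[0][0] <= ts - self.window:
            _, old = self.events.popleft()
            self.total -= old

    def average(self):
        return self.total / len(self.events) if self.events else 0.0

w = SlidingWindowAvg(window_secs=10)
w.add(0, 100)   # e.g. invented bitrate samples
w.add(5, 200)
w.add(12, 300)  # the ts=0 sample ages out here
print(w.average())  # 250.0
```

Stream processors like Flink and Spark Structured Streaming expose the same concept as built-in sliding/tumbling window operators, plus watermarking for late events.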
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its kind-innovations like time-state analytics and AI-automated data modeling, that surfaces actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!
- 5+ years of working experience in Java development
- Hands-on experience with microservices
- Extensive experience with Spring Boot / Spring
- Primary skills: Java, Microservices

Primary Responsibilities
- Design, architect and develop advanced software solutions in a cross functional Agile team supporting multiple projects and initiatives
- Collaborate with product owners and/or the business on requirements definition, development of functional specifications, and design
- Collaborate on or lead development of technical design and specifications as required
- Code, test and document new applications as well as changes to existing system functionality and ensure successful completion
- Take on leadership roles as needed
Skills & Requirements
- Bachelor’s Degree required, preferably in Computer Science or related field
- 3+ years of software development experience using GoLang/Java programming language
- Experience with cloud technologies (AWS/Azure/GCP/Pivotal Cloud Foundry/any private cloud) and containerization is required
- Experience with a micro-services architecture is a plus
- Excellent communication, collaboration, reporting, analytical and problem solving skills
- Experience with PostgreSQL or other Relational Databases
- Test-driven development mindset and a focus on quality, scalability and performance
- Strong programming fundamentals and ability to produce high quality code
- Solid understanding of Agile (SCRUM) Development Process required
• Work with product team to understand product vision and requirements
• Solve complex technical problems and perform code reviews for junior team members.
• Produce deliverables at a consistently high rate and with consistently excellent quality
• Work with a team of engineering professionals to ensure the highest quality product delivery
As a member of our team, you will be responsible for ensuring the successful launch of many product features.
Key responsibilities
• Work in a product based R&D team and collaborate with other teams to integrate.
• Write code that is of high quality and consistent with our coding standards
• Analyze highly complex business requirements, break the requirements into multiple applications; generate technical specifications to design or redesign complex software components and applications
• Maintain best practices for development/code changes as needed
• Design and architect software applications
• Conducting code reviews and enforcing the quality standards
• Conducting the daily SCRUM meetings and removing the roadblocks
• Performance testing and tuning for scalability
• Develop quick proofs of concept to set the technical direction for the rest of the team.
• Work with DevOps and Automation teams to develop an automation strategy for your application.
Requirements
• Bachelor’s Degree (Masters preferred) in Computer Science or related field
• 3+ years of software development experience on web applications
• Experience in working in an onsite and offshore development model
• Must have hands-on design and development experience in Cloud (GCP/AWS/Azure), Kubernetes, Microservices, Java, J2EE, Spring/Boot, Hibernate, JUnit and integration with the front end via REST interfaces.
• Must have Hands-on experience in Multi-threading, Non-blocking IO, Kafka, JMS
• Strong integration background required. Experience with Microservices, REST, JSON and APIs is required
• Experience with as many of the following is highly desirable: Tomcat, Node.js, XML, XSLT, XPath, Web Services, MongoDB, MySQL, and query performance tuning
• Experience with code management and continuous integration techniques and tools such as Maven, Gradle, GitHub, JMeter, Jenkins, NPM, etc. is highly desirable
• Experience building complex software systems that have been successfully delivered to customers
• Strong Computer Science fundamentals and working knowledge of data structures, algorithms, problem-solving and complexity analysis
• Knowledge of professional software engineering practices and best practices for the full software development life cycle (SDLC), including coding standards, code reviews, source control management, build processes, testing, and operations
• Proven ability to troubleshoot issues in production including root cause analysis
• Self-directed and capable of working effectively in a highly innovative and fast-paced environment
• Experience with Agile software development in a UNIX/Linux environment
• Experience with system architecture/design
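The multi-threading and non-blocking IO requirement above is about overlapping slow remote calls instead of serializing them. A hedged sketch of the concept using Python's `asyncio` (the service names and delays are invented; a Java stack would express this with NIO, CompletableFuture, or reactive streams):

```python
import asyncio

# Non-blocking I/O sketch: three simulated remote calls run concurrently,
# so total latency is roughly the slowest call, not the sum of all three.
async def fetch(name, delay):
    await asyncio.sleep(delay)  # stands in for a network round trip
    return name

async def main():
    return await asyncio.gather(
        fetch("inventory", 0.03),
        fetch("pricing", 0.02),
        fetch("catalog", 0.01),
    )

results = asyncio.run(main())
print(results)  # ['inventory', 'pricing', 'catalog'] -- gather keeps call order
```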
Qualifications
• Passionate about technology and technical challenges of all types excite you
• Eagerness to learn and learn fast, enjoy working in a fast-paced environment
• Ability to develop detailed design and deliver a scalable implementation.
• Mentor developers in analysis, design, coding and unit testing techniques
• Motivated self-starter and team player; you inspire others to achieve great things
• Driven to provide the best customer experience via technology
• Supply chain industry experience is preferred
• Proven ability to work effectively in a cross-functional team
• Strong problem solving and troubleshooting skills with the ability to come up with creative solutions for different problems
• Strong written/spoken communication skills
• Experience with distributed systems operating in a scalable/high volume environment
• Ability to drive innovation



Data Platform engineering at Uber is looking for a strong Technical Lead (Level 5a Engineer) who has built high quality platforms and services that can operate at scale. 5a Engineer at Uber exhibits following qualities:
- Demonstrate tech expertise › Demonstrate technical skills to go very deep or broad in solving classes of problems or creating broadly leverageable solutions.
- Execute large scale projects › Define, plan and execute complex and impactful projects. You communicate the vision to peers and stakeholders.
- Collaborate across teams › Act as a domain resource for engineers outside your team and help them leverage the right solutions. Facilitate technical discussions and drive them to a consensus.
- Coach engineers › Coach and mentor less experienced engineers and deeply invest in their learning and success. You give and solicit feedback, both positive and negative, to others you work with to help improve the entire team.
- Tech leadership › Lead the effort to define the best practices in your immediate team, and help the broader organization establish better technical or business processes.
What You’ll Do
- Build a scalable, reliable, operable and performant data analytics platform for Uber’s engineers, data scientists, products and operations teams.
- Work alongside the pioneers of big data systems such as Hive, Yarn, Spark, Presto, Kafka, Flink to build out a highly reliable, performant, easy to use software system for Uber’s planet scale of data.
- Become proficient in the multi-tenancy, resource isolation, abuse prevention, and self-serve debuggability aspects of a highly performant, large-scale service while building these capabilities for Uber's engineers and operations folks.
What You’ll Need
- 7+ years experience in building large scale products, distributed systems in a high caliber environment.
- Architecture: Identify and solve major architectural problems by going deep in your field or broad across different teams. Extend, improve, or, when needed, build solutions to address architectural gaps or technical debt.
- Software Engineering/Programming: Create frameworks and abstractions that are reliable and reusable. You have advanced knowledge of at least one programming language and are happy to learn more. Our core languages are Java, Python, Go, and Scala.
- Platform Engineering: Solid understanding of distributed systems and operating systems fundamentals such as concurrency, multithreading, file systems, locking etc.
- Execution & Results: You tackle large technical projects/problems that are not clearly defined. You anticipate roadblocks and have strategies to de-risk timelines. You orchestrate work that spans multiple teams and keep your stakeholders informed.
- A team player: You believe that you can achieve more on a team, that the whole is greater than the sum of its parts. You rely on others' candid feedback for continuous improvement.
- Business acumen: You understand requirements beyond the written word. Whether you’re working on an API used by other developers, an internal tool consumed by our operation teams, or a feature used by millions of customers, your attention to details leads to a delightful user experience.
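Concurrency, multithreading, and locking come up explicitly above. The canonical illustration is the lost-update problem on a shared counter, sketched here in Python (illustrative only; the same race exists with unsynchronized increments in Java or Go):

```python
import threading

# Concurrency sketch: `counter += 1` is a read-modify-write, so concurrent
# threads can lose updates; holding a lock makes the update atomic.
counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000, deterministic because of the lock
```

Sharding and partitioning, also named above, apply the same reasoning at the data layer: reduce contention by splitting shared state instead of serializing access to all of it.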

We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale. Our underpinnings are scale, impact, engagement, and visibility. We help businesses take data-driven decisions every day. We also give them insights for long-term strategy. We are focused on creating value for our customers and helping them succeed.
How we work
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems there are. We are in the business of making sense of messy public data on the web, at serious scale!
What do we offer?
- Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
- Ability to see the impact of your work and the value you are adding to our customers almost immediately.
- Opportunity to work on a variety of challenging problems and technologies to figure out what really excites you.
- A culture of openness. Fun work environment. A flat hierarchy. Organization wide visibility. Flexible working hours.
- Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the team.
- Last but not the least, competitive salary packages and fast paced growth opportunities.
Roles and Responsibilities:
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next-generation systems.
● Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrancy to the team. Push the envelope.
Skills and Requirements:
● 5-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
● Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show off some of your projects you have hosted on GitHub.
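The MapReduce/Spark bonus skill above has a compact illustration: word count as a map phase followed by a merge-style reduce. A toy sketch in Python (the corpus is invented; a Spark job would have the same map/reduce shape over distributed partitions):

```python
from collections import Counter
from functools import reduce

# MapReduce-style word count: map each document to local counts, then
# reduce by merging -- the same shape a Spark or Hadoop job takes,
# just on an in-memory toy corpus.
docs = [
    "price drop on shoes",
    "new shoes in catalog",
    "price match promise",
]

mapped = [Counter(doc.split()) for doc in docs]          # map phase
totals = reduce(lambda a, b: a + b, mapped, Counter())   # reduce phase
print(totals["price"], totals["shoes"])  # 2 2
```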