11+ IBM Rational ClearQuest Jobs in Bangalore (Bengaluru) | IBM Rational ClearQuest Job openings in Bangalore (Bengaluru)
Apply to 11+ IBM Rational ClearQuest Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest IBM Rational ClearQuest Job opportunities across top companies like Google, Amazon & Adobe.
o Expertise in software configuration management (SCM) using multi-site Rational ClearCase, ClearQuest, Git, Gerrit, JIRA, and CVS
o Experience working with continuous integration frameworks (such as Hudson and Jenkins) and debugging builds
o Expertise in build and release processes and automation, including build infrastructure setup
o In-depth expertise and knowledge of build environments and build scripts (CMake/Make, GNU Make)
o Expertise and in-depth knowledge of scripting (Shell/Perl) is a must (a minimal automation sketch in Python follows this list)
o Experience working in UNIX-like environments (Linux/Solaris) and using operating-system tools
o Expertise in performing integration builds and release management
o Experience managing software licenses for different build tools
o Exposure to Android, open-source development, and code-analysis tools such as Black Duck and Coverity is an added advantage
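The posting asks for Shell or Perl; purely as an illustration of the kind of build-and-release automation it describes, here is a minimal sketch in Python. The CMake/GNU Make invocation mirrors the build-script requirement above; the build directory and tag name are hypothetical.

```python
#!/usr/bin/env python3
"""Minimal build-and-tag sketch; paths and tag scheme are hypothetical."""
import subprocess
import sys

BUILD_DIR = "build"        # hypothetical out-of-source build directory
RELEASE_TAG = "rel-1.0.0"  # hypothetical release tag

def run(cmd: list[str]) -> None:
    """Echo and run a command, aborting the build on failure."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main() -> None:
    # Configure and build with CMake / GNU Make, per the requirements above.
    run(["cmake", "-S", ".", "-B", BUILD_DIR])
    run(["make", "-C", BUILD_DIR, "-j4"])
    # Record the successful integration build in Git.
    run(["git", "tag", "-a", RELEASE_TAG, "-m", "automated integration build"])

if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(f"build step failed: {exc}")
```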
Job Description for Automation QA
Key Responsibilities
● Test web and mobile applications/services, ensuring they meet high-quality standards.
● Conduct thorough testing of e-commerce platforms in the automobile domain (e.g., carwale.com, cars24.com).
● Perform backend REST API testing, ensuring correct data in databases and debugging issues through logs, network responses, and database validations (see the sketch after this list).
● Collaborate with cross-functional teams (developers, product managers, DevOps) to define and execute comprehensive test plans and strategies.
● Analyze and debug integration workflows, particularly with third-party services such as payment gateways and authentication providers.
● Ensure exceptional frontend UI/UX quality with meticulous attention to detail.
● Write, execute, and maintain detailed test cases based on user stories and business requirements.
● Conduct regression, integration, and user acceptance testing (UAT) to validate product functionality.
● Monitor and analyze test results, report defects, and collaborate with developers for resolution.
● Use tools such as Postman, browser developer tools, and bug-tracking systems like JIRA effectively.
● Coordinate testing activities across multiple releases and environments.
● Facilitate test preparation, execution, and reporting while ensuring alignment with Agile frameworks.
● Maintain and update test documentation following requirement changes.
● Participate in daily stand-ups and sprint planning discussions, contributing to feature validation and delivery goals.
● Monitor and triage issues in collaboration with cross-functional teams to resolve them efficiently.
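As one illustration of the backend REST API testing described above, here is a minimal pytest-style sketch using Python's requests library. The endpoint, query parameters, and response fields are hypothetical placeholders, not a real service contract.

```python
"""Minimal backend API test sketch; endpoint, params, and fields are hypothetical."""
import requests

BASE_URL = "https://api.example-carsite.com"  # hypothetical service under test

def test_vehicle_listing_api():
    # Call the listing endpoint and validate status code, payload shape, and data.
    resp = requests.get(
        f"{BASE_URL}/v1/vehicles", params={"city": "bangalore"}, timeout=10
    )
    assert resp.status_code == 200
    body = resp.json()
    assert isinstance(body.get("vehicles"), list)
    for vehicle in body["vehicles"]:
        # Field names are illustrative assumptions about the response schema.
        assert vehicle["price"] > 0
        assert vehicle["city"].lower() == "bangalore"
```

In practice the same checks would be cross-validated against the database and logs, per the responsibilities above.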
Required Skills & Qualifications
● 3+ years of experience in automation testing with hands-on exposure to web and backend testing, preferably in the e-commerce/automobile industry.
● Strong proficiency in testing tools like Postman, browser developer tools, and bug-tracking systems.
● Solid understanding of SQL (e.g., PostgreSQL), Python, or MongoDB for data verification.
● Familiarity with asynchronous communication between services (e.g., AWS SQS, Apache Kafka) and debugging issues therein (see the sketch after this list).
● Excellent knowledge of the software testing life cycle (STLC) and Agile testing methodologies.
● Experience with version control systems like Git.
● Proven ability to debug issues in API integrations, logs, and databases.
● Strong communication and documentation skills for reporting bugs and preparing detailed test reports.
● Understanding of regression testing frameworks and expertise in functional and integration testing.
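As a sketch of the async-debugging skill mentioned above, the following Python snippet pulls a few messages off an AWS SQS queue with boto3 to verify what a producer actually published. The queue URL and message fields are hypothetical.

```python
"""Minimal SQS debugging sketch using boto3; the queue URL is hypothetical."""
import json
import boto3

sqs = boto3.client("sqs", region_name="ap-south-1")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/orders"  # placeholder

# Pull a few messages to inspect what the producer actually published.
resp = sqs.receive_message(
    QueueUrl=QUEUE_URL, MaxNumberOfMessages=5, WaitTimeSeconds=2
)
for msg in resp.get("Messages", []):
    body = json.loads(msg["Body"])  # assumes JSON message bodies
    print(msg["MessageId"], body.get("order_id"), body.get("status"))
```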
Additional Preferred Qualifications
● Experience with mobile testing frameworks and tools.
● Basic understanding of performance testing and debugging for optimized user experiences.
● Exposure to automation tools (not mandatory but advantageous).
Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with at least 3 years of hands-on Dremio experience
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 Days WFO?
- The virtual interview requires video to be on; are you okay with that?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a minimal query sketch follows this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
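To make the integration surface concrete: Dremio exposes an Arrow Flight endpoint that Python clients can query via pyarrow. The following is a minimal sketch; the coordinator host, credentials, and dataset path are hypothetical placeholders, and the default Flight port (32010) should be verified against your deployment.

```python
"""Minimal Dremio query sketch over Arrow Flight; host, credentials,
and dataset path are hypothetical placeholders."""
from pyarrow import flight

client = flight.FlightClient("grpc+tcp://dremio.example.internal:32010")
# Basic auth returns a bearer-token header to attach to subsequent calls.
token = client.authenticate_basic_token("analyst", "secret")
options = flight.FlightCallOptions(headers=[token])

sql = "SELECT region, SUM(amount) AS total FROM lake.sales.orders GROUP BY region"
info = client.get_flight_info(flight.FlightDescriptor.for_command(sql), options)
reader = client.do_get(info.endpoints[0].ticket, options)
print(reader.read_all().to_pandas())
```

Queries like this are exactly what reflections and caching tuning accelerate for BI consumers.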
Ideal Candidate
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
- Communication: Excellent communication, presentation, and negotiation skills
Role: Application Developer
Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have Skills: Oracle Project Portfolio Management Cloud, Oracle Fusion PPM Cloud, Project Fundamentals Cloud
Good to Have Skills: Oracle Financials, Grants
Job Requirements:
1: Responsibilities:
a: All tasks related to configurations, data import, data export, and customizations of Oracle PPM Cloud Project Fundamentals, Project Costing, Project Billing, and Project Accounting
b: Consultant is responsible for gathering and analyzing business requirements.
c: Consultant may have to interact with the clients to better understand the product requirements or in case the design requires any kind of modifications.
d: Experience analyzing source data in data migrations, reporting findings, and creating test plans based on that analysis.
e: Experience and demonstrated proficiency with QA methodology, process, and deliverables.
2: Professional Experience:
a: Must have 4-7 years of Cloud PPM experience, including 2 to 3 end-to-end implementations.
b: Should be strong in Oracle Project Accounting, Project Costing, Project Billing, and Project Management in a cloud environment.
c: Strong understanding of Projects functional flows
Good knowledge of PL/SQL for data querying & analysis
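As a flavor of the data-querying side of the role, here is a minimal sketch using the python-oracledb driver. The connection details and the table/column names (PJF_PROJECTS_ALL_B, PROJECT_STATUS_CODE) are illustrative assumptions, not a confirmed Fusion schema.

```python
"""Minimal data-validation query sketch with python-oracledb; connection
details and table/column names are illustrative, not a confirmed schema."""
import oracledb

conn = oracledb.connect(
    user="ppm_reader", password="secret", dsn="fusionhost:1521/ppmpdb"
)
with conn.cursor() as cur:
    # Count projects per status to sanity-check migrated data.
    cur.execute(
        """
        SELECT project_status_code, COUNT(*)
        FROM pjf_projects_all_b
        GROUP BY project_status_code
        """
    )
    for status, cnt in cur:
        print(status, cnt)
conn.close()
```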
Job Responsibilities :
- Lead design projects and contribute to product strategy
- Plan and design memorable user interfaces that are elegant and usable across our iOS, Android, and web platforms.
- Deliver wireframes and prototypes to help people visualize design ideas.
- Push the bounds of creativity using the latest design techniques and languages.
- Good knowledge of graphic/visual design
- Work closely with the Engineering team to bring crazy ideas to life.
Requirements :
- Proficiency in Sketch (Component system), Adobe XD, Illustrator or any other tool that helps you churn out great design!
- Updated portfolio.
- People who can solve problems & create engaging experiences.
Position: Learning & Development Partnerships Management Specialist
Location: Bangalore onsite
Employment Type: Full-Time
Openings: 1
Role Summary:
We are seeking a dynamic and proactive specialist to lead our partnerships within the Learning & Development (L&D) ecosystem. This role involves building and managing strategic collaborations with freelance trainers, training vendors, LMS platforms, and other related partners to support the delivery of high-quality training programs globally.
Key Responsibilities:
- Expand and maintain a global pool of freelance trainers across various domains and geographies.
- Identify, engage, and manage relationships with training partners for co-delivery and service collaboration.
- Negotiate contracts, onboarding, and fee structures with trainers and service providers.
- Source and coordinate with vendors for auxiliary requirements such as training venues, content providers, and technology solutions.
- Ensure alignment of partner capabilities with organizational training needs and quality standards.
- Maintain an up-to-date database of partners and vendors with regular performance reviews.
Qualifications:
- Minimum 4 years of experience in partnerships, vendor management, or trainer relationship roles—preferably within the L&D or training industry.
- Proven experience in sourcing, evaluating, and onboarding trainers and vendors.
- Strong negotiation and contract management skills.
- Familiarity with Learning & Development trends, delivery formats, and tools (LMS, virtual platforms, etc.).
- Excellent communication, coordination, and stakeholder management skills.
Key Skills:
- Trainer and vendor sourcing
- Contract negotiation
- Partnership management
- Knowledge of L&D and training operations
- Communication & relationship building
- Attention to detail & organizational skills
- Strategic thinking and problem-solving
Description:
As a Data Engineering Lead at Company, you will be at the forefront of shaping and managing our data infrastructure with a primary focus on Google Cloud Platform (GCP). You will lead a team of data engineers to design, develop, and maintain our data pipelines, ensuring data quality, scalability, and availability for critical business insights.
Key Responsibilities:
1. Team Leadership:
a. Lead and mentor a team of data engineers, providing guidance, coaching, and performance management.
b. Foster a culture of innovation, collaboration, and continuous learning within the team.
2. Data Pipeline Development (Google Cloud Focus):
a. Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, and Dataprep.
b. Implement best practices for data extraction, transformation, and loading (ETL) processes on GCP (a minimal pipeline sketch follows this list).
3. Data Architecture and Optimization:
a. Define and enforce data architecture standards, ensuring data is structured and organized efficiently.
b. Optimize data storage, processing, and retrieval for maximum performance and cost-effectiveness on GCP.
4. Data Governance and Quality:
a. Establish data governance frameworks and policies to maintain data quality, consistency, and compliance with regulatory requirements.
b. Implement data monitoring and alerting systems to proactively address data quality issues.
5. Cross-functional Collaboration:
a. Collaborate with data scientists, analysts, and other cross-functional teams to understand data requirements and deliver data solutions that drive business insights.
b. Participate in discussions regarding data strategy and provide technical expertise.
6. Documentation and Best Practices:
a. Create and maintain documentation for data engineering processes, standards, and best practices.
b. Stay up-to-date with industry trends and emerging technologies, making recommendations for improvements as needed.
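As a purely illustrative sketch of the pipeline work in point 2, here is a minimal Apache Beam pipeline reading text files from Cloud Storage and writing rows to BigQuery. The bucket, project, dataset, and schema are hypothetical; on GCP you would swap in the Dataflow runner and a real temp location.

```python
"""Minimal Beam pipeline sketch: GCS text -> BigQuery (hypothetical names)."""
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    # Assumes simple CSV rows: user_id,event,timestamp (illustrative schema).
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "ts": ts}

def run():
    opts = PipelineOptions(runner="DirectRunner")  # use DataflowRunner on GCP
    with beam.Pipeline(options=opts) as p:
        (p
         | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
         | "Parse" >> beam.Map(parse_line)
         | "Write" >> beam.io.WriteToBigQuery(
               "example-project:analytics.events",
               schema="user_id:STRING,event:STRING,ts:STRING",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))

if __name__ == "__main__":
    run()
```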
Qualifications
● Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
● 5+ years of experience in data engineering, with a strong emphasis on Google Cloud Platform.
● Proficiency in Google Cloud services, including BigQuery, Dataflow, Dataprep, and Cloud Storage.
● Experience with data modeling, ETL processes, and data integration.
● Strong programming skills in languages like Python or Java.
● Excellent problem-solving and communication skills.
● Leadership experience and the ability to manage and mentor a team.
Booch is India’s first ‘non-alcoholic fermented’ drink that helps you release stress without getting drunk or dizzy, whether at your workplace, at home, or anywhere else you want to relax. Booch has a unique flavour and low calories with a relaxing effect, making it hard to replace with any other beverage.
It is led by alumni from top educational and professional organisations with the ambition of “Flying sober” in a world which is full of stress.
Roles & Responsibilities:
• Execute Marketing and Sales plans to achieve set business goals.
• Create strategies to build an immersive and engaging consumer product experience at retail, brand events, and in offline communities.
• Work with cross-functional teams to execute offline activities to increase brand awareness across Events, Retail shop and Community.
• Analyse sales data and campaign parameters, and apply the learnings to future campaigns. Generate sales and marketing reports for each campaign.
• Contribute to building and executing sales and marketing strategies through bottom-up, on-ground pulse monitoring and feedback: competition research, platform determination, benchmarking, messaging, and audience identification.
• Recommend and execute promotional activities in coordination with the sales team.
• Ideate consumer connect initiatives to promote brand awareness and recall among consumers.
• Collaborate with the supply chain & sales ops teams to ensure consistency of consumer experience at offline activation.
• Undertake daily administrative tasks to ensure the functionality and coordination of the team’s activities.
• Assist the creative team in designing and developing marketing collateral and promotional materials.
• Work closely with online teams who monitor forums and social media channels, with the aim of addressing and serving customers’ order-fulfilment needs on the ground.
Prerequisites:
• 1-2 years of experience in marketing and/or sales.
• Sound knowledge of sales and marketing strategies, channels, and branding for FMCG products.
• Good leadership, communication, and collaboration abilities.
• Strong time management and organizational abilities.
• Ability to travel as necessary.
• MBA from a reputed B-school is desirable but not essential.
Industry Type: FMCG
Employment Type: Full Time
Role Category: Marketing & Sales
Location: HSR Bangalore
Salary Range: Rs 25k to 35k/month
Murf AI is a fast-growing Series A-funded startup backed by Elevation Capital and Matrix Partners India. Founded in 2020 by alumni of IIT Kharagpur, we have served 2 Mn+ voice-over projects through our core product, the Murf Studio, which now caters to customers in 100+ countries.
We are working on simplifying voice audio and making high-quality voiceovers accessible to everyone, using artificial intelligence. Murf helps users create lifelike voiceovers in a matter of minutes, without the need for any recording equipment. (https://murf.ai/).
Some interesting facts about Murf AI:
Customers in 100+ countries
1Mn+ registered users
7X growth in revenue in the last 12 months
120+ voices in 20+ languages offered by Murf Studio
Job Description
We are looking for a React.js developer with a good knowledge of modern (ES6+) JavaScript to build the next generation of Murf AI Studio’s frontend. You would be responsible for building a feature-packed, cloud-based tool for generating and editing synthetic media. You will build user-facing components and implement them using state-management libraries like Redux and Flux. Your primary focus will be to develop a stable, robust, aesthetic, and maintainable product. You also need a good understanding of the modern web development toolchain, including bundlers and transpilers, and should apply these pragmatically to achieve high-quality deliverables.
Responsibilities
- Create new features or parts of applications, with a natural ability to deliver on short timelines.
- Develop components and libraries that are reusable and future-proof
- Confident in using UI/UX designs or wireframes to create the respective code and the application
- Use your knowledge of React.js and its lifecycle to maximise component performance across different devices and browsers
- Work with design and content teams to improve customer-facing landing and resource pages.
Required Skills & Qualifications
- 1-3 years of experience building complex React-based applications
- Bachelor's degree in CS or similar fields
- Deep understanding of React.js and its fundamentals, ideally including modern features like hooks
- Excellent understanding of JavaScript, including OOP concepts and how the DOM and shadow DOM work
- Experience with widely used React.js state-management solutions like Flux and Redux, and middleware like Thunk or Saga
- Knowledge of modern ECMAScript
- Experience with data structure libraries like Immutable.js
- Knowledge of RESTful APIs
- Familiarity with modern front-end build pipelines and tools like Webpack, Babel, NPM etc.
- Knowledge of the overall browser rendering behaviour and measuring and optimising performance
- Strong hands-on experience with source code management systems like Git
- Ability to understand what the business/stakeholders need and translate that into your project
Extra Awesome
- Animation skills using CSS/SVG/JS
- Experience with AWS technologies such as CloudFront, Lambda, and S3.
- Understanding of authorisation mechanisms like OAuth and JSON Web Tokens (JWT)
- Understanding of on-site Technical SEO and accessibility concepts
To ensure success as an iOS Developer, you should have a strong working knowledge of iOS Frameworks, be proficient in Objective-C, and be able to work as part of a team. Ultimately, an outstanding iOS Developer should be able to create functional, attractive applications that perfectly meet the needs of the user.
iOS Developer Responsibilities:
Designing and building mobile applications for Apple’s iOS platform.
Collaborating with the design team to define app features.
Ensuring the quality and performance of the application meet specifications.
Identifying potential problems and resolving application bottlenecks.
Fixing application bugs before final release.
Publishing the application on the App Store.
Maintaining the code and automation of the application.
Designing and implementing application updates.
iOS Developer Requirements:
Bachelor’s degree in Computer Science or Software Engineering.
Proven experience as an app developer.
Proficient in Objective-C, Swift, and Cocoa Touch.
Extensive experience with iOS Frameworks such as Core Data and Core Animation.
Knowledge of iOS back-end services.
Knowledge of Apple’s design principles and application interface guidelines.
Proficient in code versioning tools including Mercurial, Git, and SVN.
Knowledge of C-based libraries.
Familiarity with push notifications, APIs and cloud messaging.
Experience with continuous integration.
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and develop the data lake and data warehouse using Azure cloud services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration across multiple systems and technologies.
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring, code reviews, and design-level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role, along with 3 to 4 years of experience designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Experience developing batch processing, streaming, and integration solutions for processing structured and unstructured data
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies.
- Experience in design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS); see the sketch below
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premises and in the cloud): VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux.
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
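Since the mandatory skills pair Databricks and Python with Azure Data Lake Storage, here is a minimal PySpark sketch of the batch-processing pattern the role describes: read raw CSV from ADLS Gen2, aggregate, and persist a curated Delta table. The storage account, containers, and column names are hypothetical, and it assumes a Spark runtime with Delta Lake available (as on Databricks).

```python
"""Minimal PySpark sketch: raw CSV from ADLS -> curated Delta (hypothetical paths)."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls-batch-sketch").getOrCreate()

# Paths are placeholders; 'abfss://' is the ADLS Gen2 URI scheme.
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/2024/*.csv"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales_daily"

df = (spark.read.option("header", "true").csv(raw_path)
      .withColumn("ingest_date", F.current_date()))

# Aggregate to a daily mart and persist as Delta for downstream reporting.
(df.groupBy("store_id", "ingest_date")
   .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
   .write.format("delta").mode("overwrite").save(curated_path))
```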