11+ Taleo Jobs in Bangalore (Bengaluru)
Job Title: Taleo Techno Functional Consultant
Primary Module: Taleo Recruiting
Job Summary:
We are looking for an experienced Taleo Techno Functional Consultant with strong expertise in Taleo Recruiting, configuration, integrations, and automation. The ideal candidate should have hands-on experience in TCC scripting, OLF structure, Dynamic Approval Routing, onboarding transitions, and security configurations across Taleo modules.
Key Responsibilities:
1. Technical Configuration & Integration
- Develop and manage TCC (Taleo Connect Client) scripts.
- Configure Net Change operations and complex Import/Export filters.
- Work with Requisition and Candidate file structures.
- Support integrations and data migration activities.
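The "Net Change" operations mentioned above revolve around one idea: export only records that are new or modified since the last snapshot, rather than the full data set. A minimal stdlib-only sketch of that delta logic (record shapes and field names are invented for illustration; real TCC exports are configured in XML, not Python):

```python
def net_change(previous, current, key="CandidateNumber"):
    """Return records in `current` that are new or changed vs `previous`."""
    prev_by_key = {r[key]: r for r in previous}
    delta = []
    for record in current:
        old = prev_by_key.get(record[key])
        if old is None or old != record:   # new record, or any field changed
            delta.append(record)
    return delta

previous = [{"CandidateNumber": "1001", "Step": "New"},
            {"CandidateNumber": "1002", "Step": "Interview"}]
current  = [{"CandidateNumber": "1001", "Step": "Offer"},      # changed
            {"CandidateNumber": "1002", "Step": "Interview"},  # unchanged
            {"CandidateNumber": "1003", "Step": "New"}]        # new

print(net_change(previous, current))  # only the changed and new records
```

Shipping only the delta is what keeps scheduled integration runs fast on large candidate volumes.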
2. OLF & Contextualization
- Design and maintain OLF (Organization, Location, Job Field) structure.
- Configure contextualization based on OLF.
- Manage SmartOrg, User Types, and access control.
- Implement security configurations aligned with OLF structure.
3. Approvals & Automation
- Configure Dynamic Approval Routing (DAR) and DAR matrix.
- Manage Offer Approval workflows.
- Set up automated tasks, onboarding triggers, and candidate progression rules.
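Dynamic Approval Routing derives the approver chain from requisition attributes via a matrix of rules instead of a hard-coded list. A hedged sketch of that matrix idea (the attributes, thresholds, and approver titles below are invented for illustration; Taleo's DAR is configured in the product, not coded like this):

```python
# Each matrix row: (predicate on the requisition, approvers to append).
DAR_MATRIX = [
    (lambda req: req["salary"] > 150_000, ["VP Compensation"]),
    (lambda req: req["location"] == "Bangalore", ["Regional HR Lead"]),
    (lambda req: True, ["Hiring Manager"]),  # always required
]

def route_approvers(requisition):
    """Build the approval chain by evaluating each matrix rule in order."""
    approvers = []
    for predicate, names in DAR_MATRIX:
        if predicate(requisition):
            approvers.extend(names)
    return approvers

req = {"salary": 200_000, "location": "Bangalore"}
print(route_approvers(req))  # all three rules match this requisition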
4. Taleo Recruiting Functional Expertise
- Manage requisition creation, templates, and approvals.
- Handle candidate lifecycle and CSW (Candidate Selection Workflow).
- Configure Career Sections (Internal & External).
- Design Offer Letter templates and List Formats.
- Configure and manage UDFs (User-Defined Fields).
- Implement data purge rules and compliance requirements.
5. Onboarding & Transitions
- Configure onboarding types and transitions processes.
- Manage task orchestration and workflow automation.
6. Taleo Social Sourcing (TSS)
- Setup and configure TSS module.
- Manage sourcing workflows, user roles, and engagement processes.
7. Security & Access Management
- Configure User Types, permissions, and role-based access.
- Manage cross-module security integrations (Recruiting, Onboarding, TSS).
- Maintain SmartOrg access configurations.
Required Skills:
- Strong experience in Taleo Recruiting (Techno-Functional).
- Hands-on expertise in TCC, DAR, OLF structure, SmartOrg.
- Experience in onboarding transitions and workflow automation.
- Good understanding of data compliance and purge rules.
- Strong stakeholder communication skills.
Experience:
- 5
Position: AWS Data Engineer
Experience: 5 to 7 Years
Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram
Work Mode: Hybrid (3 days work from office per week)
Employment Type: Full-time
About the Role:
We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.
Key Responsibilities:
- Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
- Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
- Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
- Optimize data models and storage for cost-efficiency and performance.
- Write advanced SQL queries to support complex data analysis and reporting requirements.
- Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
- Ensure high data quality and integrity across platforms and processes.
- Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
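As one example of the "advanced SQL" the role calls for: window functions are the standard warehouse pattern for ranking or deduplicating rows per key. The sketch below uses sqlite3 as a stand-in for Athena/Redshift so it runs anywhere; table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 120.0),
        ('acme', '2024-02-10', 300.0),
        ('zen',  '2024-01-20',  75.0);
""")

# Latest order per customer via ROW_NUMBER() -- a common
# dedup / latest-record pattern in reporting queries.
rows = conn.execute("""
    SELECT customer, order_date, amount FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY customer ORDER BY order_date DESC
        ) AS rn FROM orders
    ) WHERE rn = 1
    ORDER BY customer
""").fetchall()
print(rows)
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` shape works unchanged on Redshift and Athena.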
Required Skills & Experience:
- Strong hands-on experience with Python or PySpark for data processing.
- Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Familiarity with serverless architectures and AWS best practices.
- Experience in designing and maintaining robust data architectures and data lakes.
- Ability to troubleshoot and resolve data pipeline issues efficiently.
- Strong communication and stakeholder management skills.
Join NoBrokerHood as a Field Auditor – interact with residents, visit societies, and collect genuine feedback to help us improve our services!
✅ Good communication (English + regional language)
✅ Graduate | Age below 33
✅ DL + Bike (mandatory for male candidates)
✅ Field job | 6 days/week | Weekend availability required
Position: Senior Analyst QA (Automation Testing)
Experience: 4-6 years
Location: Bengaluru
We are a multi-award-winning creative engineering company offering design and technology solutions on mobile, web and cloud platforms. We are looking for an enthusiastic and self-driven Test Engineer to join our team.
Roles and Responsibilities:
• Responsible for testing Web Applications.
• Responsible for Automation testing: Design, develop and execute automation scripts using open-source tools like Appium, Selenium and QMetry.
• Responsible for writing clear, concise, and comprehensive test plans and test cases.
• Handle client interactions.
Desired Profile:
• Extensive experience in Automation Testing of web applications
• Experience in Manual Testing
• Hands-on testing experience on Selenium, Appium, QMetry.
• Strong knowledge of software QA methodologies, tools, and processes
• Experience with performance and/or security testing is a plus.
• Excellent Communication Skills & Client Interactions.
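Automation suites with Selenium/Appium are typically structured around the Page Object Model: each page's locators and actions live in one class, so tests stay readable and locator changes touch one place. A self-contained sketch with a stub standing in for a real WebDriver (locators and page names are invented):

```python
class StubDriver:
    """Stands in for selenium.webdriver so the example runs anywhere."""
    def __init__(self):
        self.fields = {}
        self.submitted = None
    def type_into(self, locator, text):
        self.fields[locator] = text
    def click(self, locator):
        self.submitted = dict(self.fields)   # "submits" the typed fields

class LoginPage:
    # Locators kept in one place -- the core Page Object idea.
    USERNAME, PASSWORD, SUBMIT = "#user", "#pass", "#login"
    def __init__(self, driver):
        self.driver = driver
    def login(self, user, password):
        self.driver.type_into(self.USERNAME, user)
        self.driver.type_into(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

driver = StubDriver()
LoginPage(driver).login("qa_user", "secret")
print(driver.submitted)
```

With a real Selenium driver, `type_into`/`click` would be `find_element(...).send_keys(...)` and `.click()`; the page-object structure is unchanged.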
Good to Have:
• Experience in manual testing
• KPI driven
• Should be a team player
• Independently evaluates issues and proposes solutions
• Raises issues/concerns on time
• Good if he/she has experience testing AI/ML models
Qualifications:
MCA, BE/B.Tech - All Regular - 65% Above
Job Details:
- Three plus years of experience in IT systems analysis and application program development
- Experience in an Object Oriented Programming language such as Java is a must
- The candidate must be a hands-on coder doing active development
- Extensive experience in Java application development and RESTful APIs
- Experience in AWS Cloud technologies is optional but good to have
- Proficient in Object-Oriented Programming (OOP) concepts, workflows, and design patterns
Education:
Bachelor’s/Master’s degree in a related field with 2+ years of experience in ERP product sales.
Skills
- Excellent communication (verbal and written), time management skills, fast learner, self-motivated, and comfortable taking initiative and handling multiple projects simultaneously
- Excellent customer approach, interpersonal and influencing skills
- Proficient and demonstrable experience in prospecting, qualifying, creating value-based demonstrations
- A salesperson with a proven, successful background of 2+ years in ERP sales.
Job Responsibilities:
- Responsible for New Business Development via prospecting, qualifying, selling and closing software product sales in the Enterprise Division.
- Develop and implement a sales strategy for the region to maximize growth opportunities, strengthen market share, and maximize customer retention.
- Attracting new clients by innovating and overseeing the sales process for the business.
- Responsible for enhancing revenue, within existing and new clients, through continuous client engagement
- Set up meetings with potential clients, listen to their wishes and concerns, and refine the sales strategy accordingly.
- Research and identify new market opportunities.
- Create and conduct effective presentations and product demos
- Build and strengthen market intelligence and sales analytics to identify opportunities, solution effectively for clients, and convert deals.
- Consistently meet and exceed sales targets
- Foster a collaborative environment within the organization.
- Identify trends and customer needs, building a short/medium/long-term sales pipeline in accordance with targets
- Prepare and deliver pitches to potential investors.
- Create and deliver proposals and presentations based on clients’ needs, and close orders.
- Conduct webinars about our product.
- Provide timely weekly, monthly and quarterly sales reports
- Client Relationship management
- 4-10 years of experience
- Strong React, TypeScript, JavaScript
- Experience with NextJS and Material UI
- Experience with popular React.js workflows (such as Flux or Redux)
- Demonstrated mastery of HTML, CSS & JavaScript (ES6+)
- Good understanding of HTML, CSS, Javascript, jQuery, Bootstrap3/4, JSON & AJAX.
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- Experience in Graph implementation
- Thorough understanding of React.js and its core principles; Familiarity with RESTful APIs.
- Knowledge of modern authorization mechanisms, such as JSON Web Token
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- Experience in Blob data implementation with Javascript
- A knack for benchmarking and optimization
- Familiarity with code versioning tools
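On the JSON Web Tokens mentioned above: a JWT is three base64url segments, `header.payload.signature`. This stdlib-only sketch decodes the payload to show the structure; a real app must also verify the signature (e.g. HMAC-SHA256) before trusting any claim. The claims here are invented for the example:

```python
import base64
import json

def b64url_decode(segment):
    segment += "=" * (-len(segment) % 4)   # restore stripped padding
    return base64.urlsafe_b64decode(segment)

def decode_payload(token):
    header, payload, signature = token.split(".")
    return json.loads(b64url_decode(payload))

# Build a demo token (unsigned -- for structure only, never for production).
claims = {"sub": "42", "role": "admin"}
payload = base64.urlsafe_b64encode(
    json.dumps(claims).encode()).rstrip(b"=").decode()
token = f"eyJhbGciOiJub25lIn0.{payload}."

print(decode_payload(token))
```

Because the payload is only encoded, not encrypted, anyone can read it; the signature is what makes the claims trustworthy.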
ABOUT THE JOB
We are looking for a Senior Software Engineer to join our team. We believe in giving engineers responsibility, not tasks. Our goal is to motivate and challenge people to do their best work. To do that, we have a very fluid structure and give people flexibility to work on projects that they enjoy the most. This develops more capable engineers, and keeps everyone engaged and happy.
Responsibilities
- Design, develop, test, deploy, maintain and improve the software.
- Manage individual project priorities, deadlines, and deliverables with your technical expertise.
- Identify and solve bottlenecks within our software stack.
ABOUT YOU
Rubrik Software Engineers are self-starters, driven, and can manage themselves. Bottom line, if you have a limitless drive and like to win, we want to talk to you - come make history!
- Bachelor’s or Master’s degree or equivalent in computer science or related field
- 8+ years of relevant work experience.
- Proficiency in one or more general purpose programming languages like Java, C/C++, Scala, Python
- Experience with Google Cloud Platform/AWS/Azure or other public cloud technologies is a plus
- Experience working with two or more from the following: Unix/Linux environments, Windows environments, distributed systems, networking, developing large software systems, file systems, storage systems, hypervisors, databases and/or security software development.
ABOUT THE TEAM
Galactus team owns the end to end development of Rubrik’s data management suite for commercial public clouds - AWS, Azure and GCP. Our objective is to bring the simplicity of Rubrik’s on-prem data protection and management offerings to our customers in the cloud through a solution designed from ground up to be highly scalable, available & secure and yet optimized to minimize our customer’s cloud costs. We achieve this by taking a cloud-first approach to design - leveraging technologies built for the scale, elasticity and automation needs of the cloud; and deploying on our brand new SaaS platform called Polaris.
Recently we have:
- Built policy-based backup and recovery for Virtual Machines in AWS, Azure, and GCP, and for managed databases in AWS.
- Built features like granular file recovery, leveraging managed Kubernetes services in AWS for elastic compute.
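Policy-based backup turns on a retention rule set: given a stream of snapshots, the policy decides which to keep. A hedged, much-simplified sketch (keep the most recent N daily snapshots plus the newest snapshot of each of the last M months); Rubrik's actual SLA engine is far richer, and all values and names here are invented:

```python
from datetime import date

def apply_retention(snapshots, keep_daily=7, keep_monthly=3):
    """Given snapshot dates, return those the policy keeps (newest first)."""
    snapshots = sorted(snapshots, reverse=True)
    keep = set(snapshots[:keep_daily])              # most recent dailies
    months_seen = []
    for snap in snapshots:                          # newest snapshot per month
        key = (snap.year, snap.month)
        if key not in months_seen:
            months_seen.append(key)
            if len(months_seen) <= keep_monthly:
                keep.add(snap)
    return sorted(keep, reverse=True)

snaps = ([date(2024, 3, d) for d in range(1, 11)]
         + [date(2024, 2, 28), date(2024, 1, 31)])
kept = apply_retention(snaps)
print(kept)  # 7 recent dailies plus the monthly anchors
```

Everything outside `kept` is eligible for expiry, which is how retention policies bound storage cost while preserving recovery points.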
ABOUT RUBRIK
Rubrik is one of the fastest-growing companies in Silicon Valley, revolutionising data protection, and management in the emerging multi-cloud world. We are the leader in cloud data management, delivering a single platform to manage and protect data in the cloud, at the edge, and on-premises. Enterprises choose Rubrik to simplify backup and recovery, accelerate cloud adoption, enable automation at scale, and secure against cyberthreats. We’ve been recognized as a Forbes Cloud 100 Company two years in a row and as a LinkedIn Top 10 startup.
Rubrik provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Rubrik complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
Role and Responsibilities
- Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
- Build robust RESTful APIs that serve data and insights to DataWeave and other products
- Design user interaction workflows on our products and integrating them with data APIs
- Help stabilize and scale our existing systems. Help design the next generation systems.
- Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
- Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
- Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
- Constantly think scale, think automation. Measure everything. Optimize proactively.
- Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.
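One common building block of the low-latency serving layer described above is a TTL cache: expensive analytics queries are computed once and served from memory until they expire. A minimal sketch (names and TTLs are invented; production systems would use Redis or similar rather than an in-process dict):

```python
import time

class TTLCache:
    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}          # key -> (expires_at, value)

    def get_or_compute(self, key, compute):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry and entry[0] > now:
            return entry[1]                       # fresh hit, skip the query
        value = compute()                         # miss or expired: recompute
        self._store[key] = (now + self.ttl, value)
        return value

calls = []
def slow_report():
    calls.append(1)               # stands in for a heavy analytics query
    return {"rows": 1234}

cache = TTLCache(ttl_seconds=60)
print(cache.get_or_compute("daily_report", slow_report))
print(cache.get_or_compute("daily_report", slow_report))  # served from cache
print(len(calls))  # the heavy query ran only once
```

The TTL trades freshness for latency; dashboards usually tolerate data that is a minute old in exchange for sub-millisecond reads.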
Skills and Requirements
- 8-15 years of experience building and scaling APIs and web applications.
- Experience building and managing large scale data/analytics systems.
- Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
- Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
- Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
- Be a self-starter—someone who thrives in fast paced environments with minimal ‘management’.
- Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
- Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
- Use the command line like a pro. Be proficient in Git and other essential software development tools.
- Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
- Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
- Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
- Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
- It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show off some of your projects you have hosted on GitHub.