8+ Pipeline Management Jobs in Bangalore (Bengaluru)
Apply to 8+ pipeline management jobs in Bangalore (Bengaluru) on CutShort.io and explore the latest pipeline management openings across top companies like Google, Amazon, and Adobe.
Responsibilities:
- Design, develop, and maintain efficient and reliable data pipelines.
- Identify and implement process improvements, automating manual tasks and optimizing data delivery.
- Build and maintain the infrastructure for data extraction, transformation, and loading (ETL) from diverse sources using SQL and AWS cloud technologies.
- Develop data tools and solutions to empower our analytics and data science teams, contributing to product innovation.
Qualifications:
Must Have:
- 2-3 years of experience in a Data Engineering role.
- Familiarity with data pipeline and workflow management tools (e.g., Airflow, Luigi, Azkaban); a hedged Airflow sketch follows the Good-to-have list below.
- Experience with AWS cloud services.
- Working knowledge of object-oriented/functional scripting in Python.
- Experience building and optimizing data pipelines and datasets.
- Strong analytical skills and experience working with structured and unstructured data.
- Understanding of data transformation, data structures, dimensional modeling, metadata management, schema evolution, and workload management.
- A passion for building high-quality, scalable data solutions.
Good to have:
- Experience with stream-processing systems (e.g., Spark Streaming, Flink).
- Working knowledge of message queuing, stream processing, and scalable data stores.
- Proficiency in SQL and experience with NoSQL databases like Elasticsearch and Cassandra/MongoDB.
- Experience with big data tools such as HDFS/S3, Spark/Flink, Hive, HBase, and Kafka/Kinesis.
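For context on the workflow tooling mentioned in the Must Have list, the snippet below is a minimal, hedged sketch of an Airflow DAG wiring a daily extract-transform-load flow; the DAG id, schedule, and task callables are hypothetical placeholders rather than anything specified in this listing.

```python
# Minimal Airflow DAG sketch: a daily extract-transform-load flow.
# The dag_id, schedule, and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Placeholder: read the upstream result via XCom and reshape it.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value_doubled": row["value"] * 2} for row in rows]


def load(**context):
    # Placeholder: write the transformed rows to the warehouse.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_etl",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

The same three-task shape maps directly onto the ETL responsibilities described above, with each task swapped for real source, transformation, and warehouse logic.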
ABOUT US
Learners Point Academy is one of Dubai's leading training institutes. With technology and automation transforming industries and creating new job opportunities, we aim for the evolution of professional training education in the MENA region. Guided by a strong purpose and desire, we help professionals unlock their true potential.
Job Description
Designation: International Voice Process (UAE)
Department: Customer Care Executive
Location: Bangalore (Onsite)
Employment Type: Full-Time
Experience: 0–2 years
Shift Timings: Day Shift
About the Role
We are looking for driven, energetic, and results-oriented International Voice Process Associates to join our team. This role focuses on re-engaging inactive leads, dormant learners, and past enquiries, converting them into successful enrollments for our professional training and certification programs.
If you enjoy meaningful customer conversations, understand buyer psychology, and thrive in a performance-driven sales environment, this role is ideal for you.
Interview Venue Address: 3rd Floor, No. 460/454, Marathahalli - Sarjapur Outer Ring Road, Teacher's Colony, Jakkasandra, 1st Block Koramangala, HSR Layout 5th Sector, Bengaluru, Karnataka 560034 (near the Silk Board service road; landmark: Blue Dart office)
Carry an updated CV for the walk-in interview.
Contact: M Partha Sarathy
Key Responsibilities
- Contact inactive leads through calls, WhatsApp, and emails to re-initiate interest.
- Re-engage past enquiries and dormant leads with relevant program offerings.
- Effectively pitch upskilling and certification programs based on customer needs.
- Maintain strong follow-ups and ensure high-quality closures.
CRM & Process Management
- Update CRM systems accurately with lead stages, notes, and follow-up actions.
- Maintain data hygiene and follow internal sales processes and compliance guidelines.
Required Skills & Qualifications
- Bachelor’s degree preferred (any discipline).
- 0–2 years of experience in inside sales, EdTech sales, tele-sales, or customer acquisition (preferred).
- Excellent communication skills; English is mandatory.
- Strong objection handling and persuasion skills.
- High persistence, resilience, and positive mindset.
- Target-driven, self-motivated, and disciplined.
- Comfortable working in a fast-paced, high-performance sales environment.
- Working knowledge of CRM tools is an added advantage.
Key Competencies
- Consultative selling
- Active listening
- Strong persuasion & influence
- Time management & discipline
- Customer psychology understanding
- Ability to create urgency without being pushy
- High resilience and self-motivation
Why Join Our Reactivation Team?
- Attractive fixed salary + high monthly incentives
- Performance-based earning potential with uncapped incentives
- Structured sales training & product enablement
- Exposure to high-quality warm leads
- Fast-track career growth in sales & leadership
- High-energy, collaborative sales culture
Who Should Apply?
If you are passionate about consultative selling, enjoy solving learner challenges, and thrive in target-driven roles, we’d love to hear from you.
Job Summary
The Healthcare Recruiter will manage the full recruitment lifecycle—from sourcing qualified candidates to onboarding newly hired staff. This role requires strong communication skills, knowledge of healthcare roles, and the ability to build relationships with hiring managers and candidates.
Key Responsibilities
- Source, screen, and recruit candidates for a variety of healthcare positions (RN, LPN, CNA, allied health, administrative, etc.)
- Partner with hiring managers to understand staffing needs and job requirements
- Post job openings across job boards, social media, and recruitment channels
- Conduct interviews and evaluate applicants’ qualifications and fit
- Manage applicant tracking system (ATS) and maintain accurate records
- Coordinate interviews, job offers, background checks, and onboarding
- Build and maintain a network of healthcare professionals
- Represent the company at career fairs, community events, and virtual recruiting sessions
- Ensure compliance with company policies, labor laws, and industry regulations
- Contribute to recruitment strategy improvements and workforce planning
Qualifications
- Bachelor’s degree in Human Resources, Business, Healthcare Management, or related field (preferred)
- 1–3 years of recruiting experience; healthcare recruiting strongly preferred
- Familiarity with healthcare roles, certifications, and licensure requirements
- Experience using ATS platforms and sourcing tools (Indeed, LinkedIn, etc.)
- Strong interpersonal, communication, and negotiation skills
- Ability to manage multiple requisitions in a fast-paced environment
Key Skills
- Talent sourcing & pipeline building
- Candidate relationship management
- Healthcare industry knowledge
- Interviewing & assessment
- Time management & organization
- Professional communication
- Attention to detail
Why Join Us?
- Competitive salary and bonus potential
- Health, dental, and vision insurance
- Paid time off and holidays
- Opportunities for career growth and professional development
- Supportive, mission-driven work environment
Our client is the industry-leading provider of AI-assisted conversational messaging solutions. They help professionals and institutions such as doctors, lawyers, hospitals, and education institutes drive consumer experience over text messaging channels like SMS and WhatsApp in their enquiry management and customer support processes. As a forward-thinking global company, it continues to innovate and develop cutting-edge technologies like conversational AI, chatbots, and omnichannel solutions that redefine how businesses digitally communicate with their customers.
They integrate with top CRMs like Salesforce, Zoho, and HubSpot, among others, to drive engagement in key moments, and have acquired 5,000 customers across SMBs and the mid-market (from small professional practices of doctors and lawyers to large global staffing and insurance companies). They're growing at a fast pace and need a sharp, focused self-starter to join their marketing team. As a Demand Generation Manager, you will work closely with cross-functional teams, including marketing, product, and customer success, to own the pipeline.
Requirements
- Own, develop and execute end-to-end campaigns that engage and convert prospective buyers with a focus on developers and marketers within specific verticals.
- Create and oversee impactful and engaging content, targeted competition displacement campaigns and ABM campaigns.
- Work closely with product development, product marketing & customer success teams, to design and implement campaigns that deliver on business objectives.
- Design & execute campaigns that drive pipeline and generate opportunities using PLG motion.
- Understand the buyer’s journey at each stage of the Sales funnel and translate messaging and positioning into effective campaigns.
- Keep track of MQL-SQL-SAL conversion rates, analyze campaign performance, surface insights and provide recommendations for optimizing results.
- Track program results, measure program success and report metrics to stakeholders
- Build strong relationships with key cross functional stakeholders across GTM organization to ensure campaign enablement and engagement.
- Develop scalable, repeatable campaign playbooks
- Guide the creation of content alongside our Content Marketing, Product Marketing and Solutions teams
- Develop outbound and account-based campaigns that complement inbound campaign strategy
What we need:
- 10+ years of work experience, with at least 7 years in demand generation
- Graduate or Master's degree in any stream, with a good understanding of the SaaS product space
- Proven track record of successfully leading multi-channel campaigns, collaborating with stakeholders, and finding opportunities to strategically elevate marketing efforts
- Understanding of the nuances of running an integrated campaign, and ability to think both strategically and execute at an operational level with ease
- Agile, nimble, and energetic
- As an individual contributor, your performance will be measured by the pipeline you generate, your ability to execute and achieve measurable results (including new logo development), and your ownership of cross-sell campaigns that drive expansion opportunities
Desired Competencies:
- Expertise in Azure Data Factory V2
- Expertise in other Azure components such as Data Lake Store, SQL Database, and Databricks
- Working knowledge of Spark programming is a must
- Good exposure to data projects dealing with data design and source-to-target documentation, including defining transformation rules
- Strong knowledge of the CI/CD process
- Experience in building Power BI reports
- Understanding of the different ADF components: pipelines, activities, datasets, and linked services
- Exposure to dynamic configuration of pipelines using datasets and linked services (a hedged trigger-run sketch follows this list)
- Experience in designing, developing, and deploying pipelines to higher environments
- Good knowledge of file formats for flexible usage and of file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.)
- Strong knowledge of SQL queries
- Must have worked in full life-cycle development, from functional design to deployment
- Working knowledge of Git and SVN
- Good experience establishing connections with heterogeneous sources such as Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, and various databases
- Working knowledge of the different resources available in Azure, such as Storage Account, Synapse, Azure SQL Server, Azure Databricks, and Azure Purview
- Any experience with metadata management, data modelling, and related tools (Erwin, ER Studio, or others) is preferred
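To illustrate the dynamic pipeline configuration called out above, here is a hedged Python sketch that triggers a parameterised Data Factory pipeline run using the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, pipeline, and parameter names are all hypothetical, and the SDK surface should be verified against the installed package version.

```python
# Hedged sketch: trigger a parameterised ADF pipeline run from Python.
# All resource names and parameter keys below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-data-platform"     # hypothetical
FACTORY_NAME = "adf-example"            # hypothetical
PIPELINE_NAME = "pl_copy_sales"         # hypothetical

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Parameters feed the pipeline's dynamic datasets / linked services,
# e.g. referenced inside ADF as @pipeline().parameters.sourceFolder.
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"sourceFolder": "landing/2024-01-01", "targetTable": "stg_sales"},
)

# Poll the run status (e.g. "InProgress", "Succeeded").
status = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(status.status)
```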
Preferred Qualifications:
- Bachelor's degree in Computer Science or Technology
- Proven success in contributing to a team-oriented environment
- Proven ability to work creatively and analytically in a problem-solving environment
- Excellent communication (written and oral) and interpersonal skills
Qualifications
BE/B.Tech
KEY RESPONSIBILITIES:
You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts, and other extracts, and delivering these via SSIS, SSRS, SSAS, and Power BI. The role is seen as vital to delivering a single version of the truth on the client's data and to delivering the MI and BI that enable both operational and strategic decision making. You will be able to take responsibility for projects over the entire software lifecycle and work with minimum supervision; this includes technical analysis, design, development, and test support, as well as managing the delivery to production. The initial project being resourced is the development and implementation of a data warehouse and the associated MI/BI functions (a small extract sketch follows the Principal Activities list below).
Principal Activities:
1. Interpret written business requirements documents.
2. Specify (High-Level Design and Tech Spec), code, and write automated unit tests for new aspects of the MI/BI service.
3. Write clear and concise supporting documentation for deliverable items.
4. Become a member of the skilled development team, willing to contribute, share experiences, and learn as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third-line support for internally developed software.
7. Create and maintain continuous deployment pipelines.
8. Help maintain Development Team standards and principles.
9. Contribute and share learning and experiences with the greater Development team.
10. Work within the company's approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
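As a small illustration of the extract work behind the MI/BI deliverables described above, the sketch below uses pyodbc to pull changed rows into a staging table with a watermark; the server, database, and table names are hypothetical placeholders, and in the actual stack this logic would more likely live in an SSIS package or stored procedure.

```python
# Hedged sketch of an incremental extract into a staging table (watermark pattern).
# Server, database, and table names are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dwh-sql01;DATABASE=ClientDWH;Trusted_Connection=yes"  # hypothetical
)
cursor = conn.cursor()

# Pull only rows changed since the last successful load.
cursor.execute("SELECT MAX(LoadDate) FROM stg.OrdersWatermark")  # hypothetical table
last_load = cursor.fetchone()[0] or "1900-01-01"  # fall back if no prior load

cursor.execute(
    "SELECT OrderID, CustomerID, Amount, ModifiedDate "
    "FROM src.Orders WHERE ModifiedDate > ?",
    last_load,
)
rows = cursor.fetchall()

cursor.executemany(
    "INSERT INTO stg.Orders (OrderID, CustomerID, Amount, ModifiedDate) "
    "VALUES (?, ?, ?, ?)",
    [tuple(row) for row in rows],
)
conn.commit()
conn.close()
```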
Location – Bangalore
- 4+ years of experience in IT and infrastructure
- 2+ years of experience in Azure DevOps
- Experience with Azure DevOps both as a CI/CD tool and as an Agile framework
- Practical experience building and maintaining automated operational infrastructure
- Experience in building React or Angular applications; .NET is a must
- Practical experience using version control systems with Azure Repos
- Experience developing and maintaining PowerShell, ARM template, and Terraform scripts for Infrastructure as Code (a hedged sketch follows this list)
- Experience in Linux shell scripting (Ubuntu) is a must
- Hands-on experience with release automation, configuration, and debugging
- Good knowledge of branching and merging
- Integration of static code analysis tools such as SonarQube and Snyk is a must
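As a hedged illustration of the Infrastructure-as-Code scripting listed above, the sketch below drives a Terraform plan and apply from Python in a CI step; the working directory and variable values are hypothetical, and in practice this would typically be expressed as Azure DevOps pipeline tasks or PowerShell rather than Python.

```python
# Hedged sketch: run Terraform init/plan/apply as a CI step.
# Directory, workspace, and variable values are hypothetical placeholders.
import subprocess
import sys

TF_DIR = "infra/environments/dev"  # hypothetical path


def run(cmd: list[str]) -> None:
    """Run a command in the Terraform directory and fail the build on error."""
    print("+", " ".join(cmd))
    result = subprocess.run(cmd, cwd=TF_DIR)
    if result.returncode != 0:
        sys.exit(result.returncode)


run(["terraform", "init", "-input=false"])
run(["terraform", "plan", "-input=false", "-out=tfplan",
     "-var", "location=westeurope"])  # hypothetical variable
run(["terraform", "apply", "-input=false", "-auto-approve", "tfplan"])
```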
- Bring in industry best practices around creating and maintaining robust data pipelines for complex data projects, with or without an AI component
- Programmatically ingest data from several static and real-time sources (including web scraping)
- Render results through dynamic interfaces (web, mobile, dashboard) with the ability to log usage and granular user feedback
- Performance-tune and optimally implement complex Python scripts (using Spark), SQL (using stored procedures, Hive), and NoSQL queries in a production environment (see the PySpark sketch after this list)
- Industrialize ML/DL solutions, deploy and manage production services, and proactively handle data issues arising on live apps
- Perform ETL on large and complex datasets for AI applications, and work closely with data scientists on performance optimization of large-scale ML/DL model training
- Build data tools to facilitate fast data cleaning and statistical analysis
- Ensure the data architecture is secure and compliant
- Resolve issues escalated from business and functional areas on data quality, accuracy, and availability
- Work closely with the APAC CDO and coordinate with a fully decentralized team across different locations in APAC and the global HQ (Paris)
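To make the Spark performance-tuning point above concrete, here is a minimal PySpark sketch that aggregates a Hive table and controls partitioning before writing; the database, table, column, and output path names are hypothetical placeholders.

```python
# Hedged PySpark sketch: a simple aggregation over a Hive table with explicit
# partition pruning and output partitioning to avoid many small files.
# Table, column, and path names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("example_etl")        # hypothetical app name
    .enableHiveSupport()
    .getOrCreate()
)

# Read a Hive table and keep only recent partitions to limit the scan.
events = spark.table("analytics.events").where(F.col("event_date") >= "2024-01-01")

# Aggregate and repartition by the write key before persisting.
daily = (
    events.groupBy("event_date", "country")
    .agg(F.count("*").alias("event_count"))
    .repartition("event_date")
)

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/daily_events"  # hypothetical location
)
```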
You should be
- An expert in structured and unstructured data in traditional and big data environments: Oracle/SQL Server, MongoDB, Hive/Pig, BigQuery, and Spark
- Have excellent knowledge of Python programming in both traditional and distributed models (PySpark)
- An expert in shell scripting and writing schedulers
- Hands-on with cloud: deploying complex data solutions in hybrid cloud/on-premise environments, both for data extraction/storage and computation
- Hands-on in deploying production apps using large volumes of data with state-of-the-art technologies like Docker, Kubernetes, and Kafka (a small Kafka sketch follows this list)
- Strong in data security best practices
- 5+ years of experience in a data engineering role
- A Science/Engineering graduate from a Tier-1 university in the country
- And most importantly, a passionate coder who really cares about building apps that help people do things better, smarter, and faster, even while they sleep
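As a small, hedged example of the Kafka work mentioned above, the sketch below publishes and consumes JSON events using the kafka-python package; the broker address and topic name are hypothetical, and other client libraries (e.g., confluent-kafka) would look similar.

```python
# Hedged sketch with kafka-python: publish and consume JSON events on a topic.
# Broker address and topic name are hypothetical placeholders.
import json

from kafka import KafkaConsumer, KafkaProducer

BOOTSTRAP = "localhost:9092"  # hypothetical broker
TOPIC = "user-events"         # hypothetical topic

producer = KafkaProducer(
    bootstrap_servers=BOOTSTRAP,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"user_id": 123, "action": "login"})
producer.flush()

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BOOTSTRAP,
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive
)
for message in consumer:
    print(message.value)
```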
Role: DevOps Engineer
Experience: 1 to 2 years and 2 to 5 years as a DevOps Engineer (2 positions)
Location: Bangalore (5 days working)
Education Qualification: B.Tech/B.E. from Tier-1/Tier-2/Tier-3 colleges or equivalent institutes
Skills: DevOps engineering; Ruby on Rails or Python and Bash/shell skills; Docker, rkt, or a similar container engine; Kubernetes or similar clustering solutions
As DevOps Engineer, you'll be part of the team building the stage for our Software Engineers to work on, helping to enhance our product performance and reliability.
Responsibilities:
- Build and operate infrastructure to support the website, backend clusters, and ML projects in the organization.
- Helping teams become more autonomous and allowing the Operation team to focus on improving the infrastructure and optimizing processes.
- Delivering system management tooling to the engineering teams.
- Working on your own applications which will be used internally.
- Contributing to open source projects that we are using (or that we may start).
- Be an advocate for engineering best practices in and out of the company.
- Organizing tech talks and participating in meetups and representing Box8 at industry events.
- Sharing pager duty for the rare instances of something serious happening.
- Collaborating with other developers to understand and set up the tooling needed for Continuous Integration/Delivery/Deployment (CI/CD) practices.
Requirements:
- 1+ years of industry experience scaling existing back-end systems to handle ever-increasing amounts of traffic and new product requirements.
- Ruby On Rails or Python and Bash/Shell skills.
- Experience managing complex systems at scale.
- Experience with Docker, rkt or similar container engine.
- Experience with Kubernetes or similar clustering solutions.
- Experience with tools such as Ansible or Chef.
- Understanding of the importance of smart metrics and alerting.
- Hands-on experience with cloud infrastructure provisioning, deployment, and monitoring (we are on AWS and use ECS, ELB, EC2, ElastiCache, Elasticsearch, S3, CloudWatch).
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Knowledge of data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience working on Linux-based servers.
- Managing large-scale, production-grade infrastructure on AWS Cloud.
- Good knowledge of scripting languages like Ruby, Python, or Bash.
- Experience in creating a deployment pipeline from scratch.
- Expertise in any of the CI tools, preferably Jenkins.
- Good knowledge of Docker containers and their usage.
- Experience using infra/app monitoring tools such as CloudWatch, New Relic, or Sensu (a hedged CloudWatch sketch appears at the end of this listing).
Good to have:
- Knowledge of Ruby on Rails based applications and its deployment methodologies.
- Experience working on Container Orchestration tools like Kubernetes/ECS/Mesos.
- Extra points for experience with front-end development, New Relic, GCP, Kafka, and Elasticsearch.
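To illustrate the monitoring side of this role, here is a hedged boto3 sketch that publishes a custom CloudWatch metric and creates a simple alarm on it; the namespace, metric, alarm names, and region are hypothetical placeholders.

```python
# Hedged boto3 sketch: publish a custom CloudWatch metric and alarm on it.
# Namespace, metric, alarm names, and region are hypothetical placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # hypothetical region

# Emit a custom application metric (e.g. queue depth sampled by a cron job).
cloudwatch.put_metric_data(
    Namespace="Example/App",
    MetricData=[{"MetricName": "QueueDepth", "Value": 42, "Unit": "Count"}],
)

# Alarm when the metric stays high for three consecutive 1-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="example-queue-depth-high",
    Namespace="Example/App",
    MetricName="QueueDepth",
    Statistic="Average",
    Period=60,
    EvaluationPeriods=3,
    Threshold=100,
    ComparisonOperator="GreaterThanThreshold",
)
```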