Pipeline management Jobs in Bangalore (Bengaluru)

SMS Magic
Agency job
via Merito by Jinita Sumaria
Remote, Pune, Bengaluru (Bangalore)
8 - 12 yrs
₹25L - ₹30L / yr
Demand generation
Email Marketing
Pipeline management

Our client is the industry-leading provider of AI-assisted conversational messaging solutions. They help professionals and institutions such as doctors, lawyers, hospitals, and education institutes drive consumer experience over text messaging channels like SMS and WhatsApp in their enquiry-management and customer-support processes. As a forward-thinking global company, it continues to innovate and develop cutting-edge technologies like conversational AI, chatbots, and omnichannel solutions that redefine how businesses digitally communicate with their customers.


They integrate with top CRMs like Salesforce, Zoho, and HubSpot, among others, to drive engagement in key moments, and have acquired 5,000 customers across SMBs and the mid-market (from small professional practices of doctors and lawyers to large global staffing and insurance companies). They're growing at a fast pace and need a sharp, focused self-starter to join their marketing team. As a Demand Generation Manager, you will work closely with cross-functional teams, including marketing, product, and customer success, to own the pipeline.


Requirements



  • Own, develop, and execute end-to-end campaigns that engage and convert prospective buyers, with a focus on developers and marketers within specific verticals.
  • Create and oversee impactful and engaging content, targeted competition-displacement campaigns, and ABM campaigns.
  • Work closely with the product development, product marketing, and customer success teams to design and implement campaigns that deliver on business objectives.
  • Design and execute campaigns that drive pipeline and generate opportunities using a product-led growth (PLG) motion.
  • Understand the buyer's journey at each stage of the sales funnel and translate messaging and positioning into effective campaigns.
  • Keep track of MQL-SQL-SAL conversion rates, analyze campaign performance, surface insights, and provide recommendations for optimizing results (a toy example of this calculation follows this list).
  • Track program results, measure program success, and report metrics to stakeholders.
  • Build strong relationships with key cross-functional stakeholders across the GTM organization to ensure campaign enablement and engagement.
  • Develop scalable, repeatable campaign playbooks.
  • Guide the creation of content alongside the Content Marketing, Product Marketing, and Solutions teams.
  • Develop outbound and account-based campaigns that complement the inbound campaign strategy.
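
As a rough illustration of the funnel tracking mentioned above, here is a minimal Python sketch that computes stage-to-stage conversion rates; the stage counts are hypothetical and stand in for whatever your marketing-automation platform actually reports.

```python
# Hypothetical monthly funnel counts; in practice these would come
# from your marketing-automation / CRM reporting.
funnel = {"MQL": 1200, "SQL": 420, "SAL": 180}

stages = list(funnel)
for prev, cur in zip(stages, stages[1:]):
    rate = funnel[cur] / funnel[prev] * 100
    print(f"{prev} -> {cur}: {rate:.1f}% conversion")
```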


What we need:

  • 10+ years of work experience, with at least 7 years in demand generation.
  • Graduate/Master's degree in any stream, with a good understanding of the SaaS product space.
  • Proven track record of leading multi-channel campaigns successfully, collaborating with stakeholders, and finding opportunities to strategically level up marketing efforts.
  • Understanding of the nuances of running an integrated campaign, and the ability both to think strategically and to execute at an operational level with ease.
  • Agile, nimble, and energetic.
  • As an individual contributor, your performance will be measured by the pipeline you generate, your ability to execute and achieve measurable results (including new-logo development), and your ownership of cross-sell campaigns that drive expansion opportunities.



Red.Health

Posted by Mayur Bellapu
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more

Job Description: Data Engineer

We are looking for a curious Data Engineer to join our extremely fast-growing tech team at StanPlus.

 

About RED.Health (formerly StanPlus Technologies)

Get to know the team:

Join our team and help us build the world’s fastest and most reliable emergency response system using cutting-edge technology.

Because every second counts in an emergency, we are building systems and flows with four nines (99.99%) of reliability to ensure that our technology is always there when people need it the most. We are looking for distributed-systems experts who can help us perfect the architecture behind our key design principles: scalability, reliability, programmability, and resiliency. Our system features a powerful dispatch engine that connects emergency service providers with patients in real time.

Key Responsibilities

●     Build data ETL pipelines (a PySpark sketch follows this list)

●     Develop dataset processes

●     Apply strong analytical skills to unstructured datasets

●     Evaluate business needs and objectives

●     Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery

●     Interpret trends and patterns

●     Work with data and analytics experts to strive for greater functionality in our data system

●     Build algorithms and prototypes

●     Explore ways to enhance data quality and reliability

●     Work with the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.

●     Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
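
To make the "build data ETL pipelines" bullet concrete, here is a minimal PySpark sketch of a batch ETL job. It is a sketch only: the bucket paths and column names (trips.csv, pickup_ts, duration_min, city) are hypothetical placeholders, not RED.Health's actual schema.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trips-etl").getOrCreate()

# Extract: hypothetical raw CSV dump of trip events.
raw = spark.read.csv("s3a://raw-bucket/trips.csv", header=True, inferSchema=True)

# Transform: drop malformed rows, derive a date column, aggregate per city/day.
clean = (
    raw.dropna(subset=["trip_id", "pickup_ts", "city"])
       .withColumn("pickup_date", F.to_date("pickup_ts"))
)
daily = clean.groupBy("city", "pickup_date").agg(
    F.count("trip_id").alias("trips"),
    F.avg("duration_min").alias("avg_duration_min"),
)

# Load: write a partitioned Parquet dataset for downstream analytics.
daily.write.mode("overwrite").partitionBy("pickup_date").parquet(
    "s3a://curated-bucket/daily_trips/"
)
```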

 

Key Requirements

●     Proven experience of at least 3 years as a data engineer, software developer, or in a similar role.

●     Bachelor's / Master’s degree in data engineering, big data analytics, computer engineering, or related field.

●     Experience with big data tools: Hadoop, Spark, Kafka, etc.

●     Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.

●     Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the Airflow sketch after this list).

●     Experience with Azure or AWS cloud services (e.g., EC2, EMR, RDS, Redshift)

●     Experience with BigQuery

●     Experience with stream-processing systems: Storm, Spark-Streaming, etc.

●     Experience with languages: Python, Java, C++, Scala, SQL, R, etc.

●     Good hands-on experience with Hive and Presto.
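
For the workflow-management bullet above, this is a minimal Airflow sketch of a daily extract-transform-load DAG. The task bodies and the trips_daily_etl name are illustrative stubs under assumed Airflow 2.x, not a prescribed implementation.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and aggregate the extracted data")

def load():
    print("write curated tables to the warehouse")

with DAG(
    dag_id="trips_daily_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```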

 


Bengaluru (Bangalore)
4 - 10 yrs
₹15L - ₹22L / yr
SQL Azure
ADF
Business process management
Windows Azure
SQL
+12 more

Desired Competencies:

 

  • Expertise in Azure Data Factory V2.
  • Expertise in other Azure components like Data Lake Store, SQL Database, and Databricks.
  • Must have working knowledge of Spark programming (a PySpark sketch follows this list).
  • Good exposure to data projects dealing with data design and source-to-target documentation, including defining transformation rules.
  • Strong knowledge of the CI/CD process.
  • Experience in building Power BI reports.
  • Understanding of the different components: pipelines, activities, datasets, and linked services.
  • Exposure to dynamic configuration of pipelines using datasets and linked services.
  • Experience in designing, developing, and deploying pipelines to higher environments.
  • Good knowledge of file formats for flexible usage and file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.).
  • Strong knowledge of SQL queries.
  • Must have worked in full life-cycle development, from functional design to deployment.
  • Should have working knowledge of Git and SVN.
  • Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.
  • Should have working knowledge of the different resources available in Azure, like Storage Accounts, Synapse, Azure SQL Server, Azure Databricks, and Azure Purview.
  • Any experience with metadata management, data modelling, and related tools (Erwin, ER/Studio, or others) would be preferred.
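
As a small illustration of the Spark and source-to-target items above, here is a hedged PySpark sketch that applies a mapping of renames, casts, and one transformation rule. The abfss:// paths, storage-account and column names are hypothetical, and storage authentication is assumed to be configured on the cluster (e.g., on Databricks).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("src-to-target").getOrCreate()

# Hypothetical source extract landed in ADLS by a Data Factory copy activity.
src = spark.read.parquet(
    "abfss://landing@examplelake.dfs.core.windows.net/sales/"
)

# Source-to-target mapping: target column -> (source column, target type).
mapping = {
    "customer_id": ("CUST_NO", "int"),
    "order_value": ("ORDER_AMT", "decimal(18,2)"),
    "order_date":  ("ORDER_DT", "date"),
}

target = src.select(
    *[F.col(s).cast(t).alias(tgt) for tgt, (s, t) in mapping.items()]
).where(F.col("order_value") > 0)  # example transformation rule

target.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/sales/"
)
```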

 

Preferred Qualifications:

  • Bachelor's degree in Computer Science or Technology.
  • Proven success in contributing to a team-oriented environment.
  • Proven ability to work creatively and analytically in a problem-solving environment.
  • Excellent communication (written and oral) and interpersonal skills.

Qualifications

BE/B.Tech

KEY RESPONSIBILITIES:

You will join a team designing and building a data warehouse covering both relational and dimensional models, and developing reports, data marts, and other extracts delivered via SSIS, SSRS, SSAS, and Power BI. The role is seen as vital in delivering a single version of the truth for the client's data, and in providing the MI and BI that enable both operational and strategic decision-making.
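
For readers unfamiliar with dimensional modelling, here is a toy Python sketch (using pandas rather than the SSIS stack named above) of splitting a flat extract into a star schema with a date dimension and a fact table; the column names are invented for illustration.

```python
import pandas as pd

# Hypothetical flat extract from an operational system.
extract = pd.DataFrame({
    "order_id": [101, 102, 103],
    "order_dt": ["2024-01-05", "2024-01-05", "2024-01-06"],
    "amount":   [250.0, 99.5, 410.0],
})
extract["order_dt"] = pd.to_datetime(extract["order_dt"])

# Date dimension: one row per distinct date, with a surrogate key.
dim_date = extract[["order_dt"]].drop_duplicates().reset_index(drop=True)
dim_date["date_key"] = dim_date.index + 1
dim_date["year"] = dim_date["order_dt"].dt.year
dim_date["month"] = dim_date["order_dt"].dt.month

# Fact table: measures plus foreign keys into the dimensions.
fact_orders = extract.merge(dim_date[["order_dt", "date_key"]], on="order_dt")
fact_orders = fact_orders[["order_id", "date_key", "amount"]]

print(dim_date)
print(fact_orders)
```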

You will be able to take responsibility for projects over the entire software lifecycle and work with minimum supervision. This would include technical analysis, design, development, and test support as well as managing the delivery to production.

The initial project being resourced is around the development and implementation of a Data Warehouse and associated MI/BI functions.

 

Principal Activities:

1. Interpret written business-requirements documents.
2. Specify (high-level design and tech spec), code, and write automated unit tests for new aspects of the MI/BI service.
3. Write clear and concise supporting documentation for deliverable items.
4. Be a contributing member of the skilled development team, sharing experiences and learning as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third-line support for internally developed software.
7. Create and maintain continuous-deployment pipelines.
8. Help maintain the Development Team's standards and principles.
9. Contribute and share learning and experiences with the greater development team.
10. Work within the company's approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
13. Comply with any reasonable instructions or regulations issued by the Company from time to time, including those set out in the dealing and other manuals, staff handbooks, and all other group policies.


Location – Bangalore

 

Bengaluru (Bangalore)
3 - 7 yrs
₹3L - ₹20L / yr
DevOps
Windows Azure
Linux/Unix
Microsoft Windows Azure
SQL Azure
+5 more
  • 4+ years of experience in IT and infrastructure.
  • 2+ years of experience with Azure DevOps.
  • Experience using Azure DevOps both as a CI/CD tool and as an Agile framework.
  • Practical experience building and maintaining automated operational infrastructure.
  • Experience building React or Angular applications; .NET is a must.
  • Practical experience using version control systems with Azure Repos.
  • Experience developing and maintaining PowerShell scripts and ARM templates/Terraform for Infrastructure as Code (a deployment sketch follows this list).
  • Experience in Linux shell scripting (Ubuntu) is a must.
  • Hands-on experience with release automation, configuration, and debugging.
  • Good knowledge of branching and merging.
  • Integration of static code analysis tools such as SonarQube and Snyk is a must.
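
As a hedged illustration of the Infrastructure-as-Code item above, here is a small Python wrapper that deploys an ARM template through the Azure CLI; the resource-group and file names are placeholders, and the sketch assumes `az` is installed and already logged in.

```python
import subprocess

def deploy_arm_template(resource_group: str, template: str, params: str) -> None:
    """Deploy an ARM template via the Azure CLI (assumes `az login` was done)."""
    cmd = [
        "az", "deployment", "group", "create",
        "--resource-group", resource_group,
        "--template-file", template,
        "--parameters", f"@{params}",
    ]
    subprocess.run(cmd, check=True)  # raises if the deployment fails

if __name__ == "__main__":
    # Hypothetical names; substitute your own resource group and files.
    deploy_arm_template("rg-dev", "main.json", "main.parameters.json")
```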

Curl Analytics
Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
5 - 10 yrs
₹15L - ₹30L / yr
ETL
Big Data
Data engineering
Apache Kafka
PySpark
+11 more
What you will do
  • Bring in industry best practices around creating and maintaining robust data pipelines for complex data projects, with or without an AI component:
    • programmatically ingesting data from several static and real-time sources (incl. web scraping; a streaming-ingestion sketch follows this list);
    • rendering results through dynamic interfaces, incl. web/mobile/dashboard, with the ability to log usage and granular user feedback;
    • performance tuning and optimal implementation of complex Python scripts (using Spark), SQL (using stored procedures, Hive), and NoSQL queries in a production environment.
  • Industrialize ML/DL solutions, deploy and manage production services, and proactively handle data issues arising on live apps.
  • Perform ETL on large and complex datasets for AI applications, and work closely with data scientists on performance optimization of large-scale ML/DL model training.
  • Build data tools to facilitate fast data cleaning and statistical analysis.
  • Ensure the data architecture is secure and compliant.
  • Resolve issues escalated from business and functional areas concerning data quality, accuracy, and availability.
  • Work closely with the APAC CDO, and coordinate with a fully decentralized team across different locations in APAC and the global HQ (Paris).
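
To ground the real-time ingestion bullet above, here is a minimal PySpark Structured Streaming sketch that reads JSON events from Kafka and appends them to Parquet; the broker address, topic name, event schema, and output paths are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Requires the spark-sql-kafka connector on the classpath.
spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Hypothetical schema for the JSON events on the topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("value", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "app-events")                 # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/app_events/")
    .option("checkpointLocation", "/chk/app_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```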

You should be

  • Expert in structured and unstructured data in traditional and big data environments: Oracle/SQL Server, MongoDB, Hive/Pig, BigQuery, and Spark.
  • Have excellent knowledge of Python programming in both traditional and distributed models (PySpark).
  • Expert in shell scripting and writing schedulers.
  • Hands-on experience with the cloud: deploying complex data solutions in hybrid cloud/on-premise environments, both for data extraction/storage and computation.
  • Hands-on experience deploying production apps using large volumes of data, with state-of-the-art technologies like Docker, Kubernetes, and Kafka.
  • Strong knowledge of data security best practices.
  • 5+ years of experience in a data engineering role.
  • Science/engineering graduate from a Tier-1 university in the country.
  • And most importantly, a passionate coder who really cares about building apps that help people do things better, smarter, and faster, even while they sleep.
Bengaluru (Bangalore)
1 - 5 yrs
₹12L - ₹25L / yr
DevOps
Docker
Kubernetes
Ansible
Ruby on Rails (RoR)
+6 more

Role : DevOps Engineer 

Experience : 1 to 2 years and 2 to 5 years as a DevOps Engineer (2 positions)

Location : Bangalore. 5 Days Working.

Education Qualification : B.Tech/B.E. from Tier-1/Tier-2/Tier-3 colleges or equivalent institutes

 

Skills : DevOps engineering; Ruby on Rails or Python with Bash/shell skills; Docker, rkt, or a similar container engine; Kubernetes or similar clustering solutions


As a DevOps Engineer, you'll be part of the team building the stage for our Software Engineers to work on, helping to enhance our product's performance and reliability.

Responsibilities: 

  • Build and operate infrastructure to support the website, backend cluster, and ML projects in the organization.
  • Help teams become more autonomous, allowing the Operations team to focus on improving the infrastructure and optimizing processes.
  • Deliver system-management tooling to the engineering teams.
  • Work on your own applications, which will be used internally.
  • Contribute to open-source projects that we are using (or that we may start).
  • Be an advocate for engineering best practices in and out of the company.
  • Organize tech talks, participate in meetups, and represent Box8 at industry events.
  • Share pager duty for the rare instances of something serious happening.
  • Collaborate with other developers to understand and set up the tooling needed for Continuous Integration/Delivery/Deployment (CI/CD) practices.

Requirements: 

  • 1+ years of industry experience.
  • Scale existing backend systems to handle ever-increasing amounts of traffic and new product requirements.
  • Ruby on Rails or Python and Bash/shell skills.
  • Experience managing complex systems at scale.
  • Experience with Docker, rkt, or a similar container engine.
  • Experience with Kubernetes or similar clustering solutions.
  • Experience with tools such as Ansible or Chef.
  • Understanding of the importance of smart metrics and alerting.
  • Hands-on experience with cloud infrastructure provisioning, deployment, and monitoring (we are on AWS and use ECS, ELB, EC2, ElastiCache, Elasticsearch, S3, and CloudWatch); see the sketch after this list.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Knowledge of data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience working on Linux-based servers.
  • Managing large-scale, production-grade infrastructure on AWS Cloud.
  • Good knowledge of scripting languages like Ruby, Python, or Bash.
  • Experience creating deployment pipelines from scratch.
  • Expertise in any of the CI tools, preferably Jenkins.
  • Good knowledge of Docker containers and their usage.
  • Experience with infra/app monitoring tools like CloudWatch, New Relic, or Sensu.
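
As a hedged sketch of the AWS monitoring item above, here is a small boto3 script that counts running EC2 instances and publishes the count as a custom CloudWatch metric; the region and the Custom/Infra namespace are illustrative choices, and AWS credentials are assumed to be configured.

```python
import boto3

REGION = "ap-south-1"  # placeholder region

ec2 = boto3.client("ec2", region_name=REGION)
cloudwatch = boto3.client("cloudwatch", region_name=REGION)

# Count instances currently in the 'running' state.
resp = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)
running = sum(len(r["Instances"]) for r in resp["Reservations"])

# Publish the count as a custom CloudWatch metric for dashboards/alarms.
cloudwatch.put_metric_data(
    Namespace="Custom/Infra",  # hypothetical namespace
    MetricData=[{"MetricName": "RunningInstances", "Value": running, "Unit": "Count"}],
)
print(f"running instances: {running}")
```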

Good to have: 

  • Knowledge of Ruby on Rails-based applications and their deployment methodologies.
  • Experience working with container-orchestration tools like Kubernetes/ECS/Mesos.
  • Extra points for experience with front-end development, New Relic, GCP, Kafka, or Elasticsearch.