5+ Pipeline Management Jobs in Bangalore (Bengaluru)
Apply to 5+ pipeline management jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest pipeline management job openings across top companies like Google, Amazon & Adobe.
Our client is an industry-leading provider of AI-assisted conversational messaging solutions. They help professionals and institutions such as doctors, lawyers, hospitals, and education institutes drive consumer experience over text messaging channels like SMS and WhatsApp in their enquiry management and customer support processes. A forward-thinking global company, they continue to innovate and develop cutting-edge technologies like conversational AI, chatbots, and omnichannel solutions that redefine how businesses digitally communicate with their customers.
They integrate with top CRMs like Salesforce, Zoho, and HubSpot to drive engagement in key moments, and have acquired 5,000 customers across SMBs and the mid-market (from small professional practices of doctors and lawyers to large global staffing and insurance companies). They're growing at a fast pace and need a sharp, focused self-starter to join their marketing team. As a Demand Generation Manager, you will work closely with cross-functional teams, including marketing, product, and customer success, to own the pipeline.
Requirements
- Own, develop, and execute end-to-end campaigns that engage and convert prospective buyers, with a focus on developers and marketers within specific verticals.
- Create and oversee impactful and engaging content, targeted competition-displacement campaigns, and ABM campaigns.
- Work closely with the product development, product marketing, and customer success teams to design and implement campaigns that deliver on business objectives.
- Design and execute campaigns that drive pipeline and generate opportunities using a product-led growth (PLG) motion.
- Understand the buyer's journey at each stage of the sales funnel and translate messaging and positioning into effective campaigns.
- Keep track of MQL-to-SQL-to-SAL conversion rates, analyze campaign performance, surface insights, and provide recommendations for optimizing results.
- Track program results, measure program success, and report metrics to stakeholders.
- Build strong relationships with key cross-functional stakeholders across the GTM organization to ensure campaign enablement and engagement.
- Develop scalable, repeatable campaign playbooks.
- Guide the creation of content alongside our Content Marketing, Product Marketing, and Solutions teams.
- Develop outbound and account-based campaigns that complement the inbound campaign strategy.
What we need:
- 10+ years of work experience, with at least 7 years in demand generation.
- Graduate or master's degree in any stream, with a good understanding of the SaaS product space.
- Proven track record of leading successful multi-channel campaigns, collaborating with stakeholders, and finding opportunities to strategically up-level marketing efforts.
- Understanding of the nuances of running an integrated campaign, and the ability to think strategically while executing at an operational level with ease.
- Agile, nimble, and energetic.
- As an individual contributor, your performance will be measured by the pipeline you generate, your ability to execute and achieve measurable results (including new-logo development), and your ownership of cross-sell campaigns that drive expansion opportunities.
A global Business Process Management (BPM) company
Desired Competencies:
- Expertise in Azure Data Factory V2.
- Expertise in other Azure components such as Data Lake Store, SQL Database, and Databricks.
- Working knowledge of Spark programming is a must (a minimal PySpark sketch follows this list).
- Good exposure to data projects dealing with data design and source-to-target documentation, including defining transformation rules.
- Strong knowledge of the CI/CD process.
- Experience in building Power BI reports.
- Understanding of the different components: pipelines, activities, datasets, and linked services.
- Exposure to dynamic configuration of pipelines using datasets and linked services.
- Experience in designing, developing, and deploying pipelines to higher environments.
- Good knowledge of file formats for flexible usage and of file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.).
- Strong knowledge of SQL queries.
- Must have worked in full life-cycle development, from functional design to deployment.
- Working knowledge of Git and SVN.
- Good experience in establishing connections with heterogeneous sources such as Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, and various databases.
- Working knowledge of the different resources available in Azure, such as Storage Accounts, Synapse, Azure SQL Server, Azure Databricks, and Azure Purview.
- Any experience with metadata management, data modelling, and related tools (Erwin, ER/Studio, or others) is preferred.
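To make the Spark expectation above concrete, here is a minimal PySpark sketch of the kind of source-to-target load these projects involve. The storage account, container names, columns, and transformation rules are all hypothetical, not part of the role description.

```python
# Minimal PySpark sketch of a source-to-target transformation
# (hypothetical paths, columns, and rules -- illustration only).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stt-demo").getOrCreate()

# Source: a delimited file landed in ADLS (path is an assumption).
src = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplestore.dfs.core.windows.net/sales/2024/"))

# Apply documented transformation rules: trim keys, cast amounts,
# and derive a load date for auditing.
tgt = (src
       .withColumn("customer_id", F.trim(F.col("customer_id")))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("load_dt", F.current_date()))

# Target: partitioned Parquet in the curated zone.
(tgt.write
    .mode("overwrite")
    .partitionBy("load_dt")
    .parquet("abfss://curated@examplestore.dfs.core.windows.net/sales/"))
```

Inside Data Factory itself, logic like this would typically sit behind a Databricks notebook activity or a mapping data flow, with the paths supplied by parameterized datasets and linked services.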
Preferred Qualifications:
- Bachelor's degree in Computer Science or Technology.
- Proven success in contributing to a team-oriented environment.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Excellent communication (written and oral) and interpersonal skills.
Qualifications
BE/B.Tech
Key Responsibilities:
You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts, and other extracts, and delivering these via SSIS, SSRS, SSAS, and Power BI (a brief illustrative sketch of a dimensional-model load follows the activity list below). The role plays a vital part in delivering a single version of the truth on the client's data, and in delivering the MI and BI that enable both operational and strategic decision making. You will be able to take responsibility for projects over the entire software lifecycle and work with minimal supervision. This includes technical analysis, design, development, and test support, as well as managing delivery to production. The initial project being resourced is the development and implementation of a data warehouse and the associated MI/BI functions.
Principal Activities:
1. Interpret written business requirements documents.
2. Specify (high-level design and tech spec), code, and write automated unit tests for new aspects of the MI/BI service.
3. Write clear and concise supporting documentation for deliverable items.
4. Become a member of the skilled development team, willing to contribute, share experiences, and learn as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third-line support for internally developed software.
7. Create and maintain continuous deployment pipelines.
8. Help maintain Development Team standards and principles.
9. Contribute and share learning and experiences with the greater Development team.
10. Work within the company's approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
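As a language-neutral illustration of the dimensional-model work described above (the delivery stack here is SSIS/T-SQL rather than Python), the following pandas sketch shows the core of a Type 2 slowly-changing-dimension merge. All table names, columns, and values are invented for the example, and dates are kept as ISO strings for simplicity.

```python
# Sketch of a Type 2 slowly-changing-dimension merge (hypothetical data).
import pandas as pd

TODAY, OPEN_END = "2024-01-15", "9999-12-31"

# Current dimension rows; OPEN_END marks the active version of each key.
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Bangalore", "Pune"],
    "valid_from": ["2023-01-01", "2023-01-01"],
    "valid_to": [OPEN_END, OPEN_END],
})

# Staged rows: customer 2 has moved, customer 3 is brand new.
stage = pd.DataFrame({"customer_id": [2, 3], "city": ["Mumbai", "Delhi"]})

active = dim[dim["valid_to"] == OPEN_END]
merged = stage.merge(active, on="customer_id", how="left", suffixes=("", "_old"))
changed = merged[merged["city_old"].notna() & (merged["city"] != merged["city_old"])]
new = merged[merged["city_old"].isna()]

# Expire the active versions of changed keys, then append fresh
# versions for both changed and brand-new keys.
expire = dim["customer_id"].isin(changed["customer_id"]) & (dim["valid_to"] == OPEN_END)
dim.loc[expire, "valid_to"] = TODAY
inserts = pd.concat([changed, new])[["customer_id", "city"]].copy()
inserts["valid_from"], inserts["valid_to"] = TODAY, OPEN_END
dim = pd.concat([dim, inserts], ignore_index=True)
print(dim.sort_values(["customer_id", "valid_from"]))
```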
Location – Bangalore
A top-tier Level 5 services company
- 4+ years of experience in IT and infrastructure.
- 2+ years of experience in Azure DevOps.
- Experience using Azure DevOps both as a CI/CD tool and as an Agile framework.
- Practical experience building and maintaining automated operational infrastructure.
- Experience in building React or Angular applications; .NET is a must.
- Practical experience using version control systems with Azure Repos.
- Developed and maintained scripts using PowerShell, ARM templates, or Terraform for Infrastructure as Code (a minimal sketch of driving Terraform from a release script follows this list).
- Linux shell scripting experience (Ubuntu) is a must.
- Hands-on experience with release automation, configuration, and debugging.
- Good knowledge of branching and merging.
- Integration of static code analysis tools such as SonarQube and security scanning tools such as Snyk is a must.
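As a rough illustration of the Infrastructure-as-Code point above, here is a minimal sketch of driving Terraform from a Python release script. The `infra` working directory and plan-file name are assumptions, and in practice the same three commands would usually run as Azure DevOps pipeline steps.

```python
# Minimal sketch: run the standard Terraform workflow from a release
# script. The "infra" directory and plan-file name are assumed.
import subprocess

def run(cmd, cwd="infra"):
    # check=True aborts the release step as soon as any command fails.
    subprocess.run(cmd, cwd=cwd, check=True)

run(["terraform", "init", "-input=false"])             # fetch providers/modules
run(["terraform", "plan", "-input=false", "-out=tfplan"])
run(["terraform", "apply", "-input=false", "tfplan"])  # apply the saved plan
```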
- Bring in industry best practices around creating and maintaining robust data pipelines for complex data projects, with or without an AI component:
  - programmatically ingesting data from several static and real-time sources (including web scraping);
  - rendering results through dynamic interfaces, including web, mobile, and dashboards, with the ability to log usage and granular user feedback;
  - performance tuning and optimal implementation of complex Python scripts (using Spark), SQL (using stored procedures, Hive), and NoSQL queries in a production environment (a short Spark tuning sketch follows this list).
- Industrialize ML/DL solutions, deploy and manage production services, and proactively handle data issues arising on live apps.
- Perform ETL on large and complex datasets for AI applications, and work closely with data scientists on performance optimization of large-scale ML/DL model training.
- Build data tools to facilitate fast data cleaning and statistical analysis.
- Ensure the data architecture is secure and compliant.
- Resolve issues escalated from business and functional areas on data quality, accuracy, and availability.
- Work closely with the APAC CDO and coordinate with a fully decentralized team across different locations in APAC and the global HQ (Paris).
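As one concrete (and hypothetical) example of the Spark performance tuning mentioned above, broadcasting a small lookup table is often the first optimization applied to a shuffle-heavy join. Paths and column names are placeholders.

```python
# Example of a common Spark tuning pattern: broadcast a small lookup
# table so the large fact data never shuffles (hypothetical paths).
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl-tuning-demo").getOrCreate()

events = spark.read.parquet("/data/events/")        # large fact data
countries = spark.read.parquet("/data/countries/")  # small lookup table

# broadcast() ships the small table to every executor, replacing a
# shuffle join with a map-side join.
enriched = events.join(broadcast(countries), on="country_code")
enriched.write.mode("overwrite").parquet("/data/enriched_events/")
```

Spark will often broadcast small tables on its own below `spark.sql.autoBroadcastJoinThreshold`; the explicit hint documents the intent and still applies when table statistics are missing.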
You should be:
- An expert in structured and unstructured data in traditional and big data environments: Oracle / SQL Server, MongoDB, Hive / Pig, BigQuery, and Spark.
- Excellent at Python programming in both traditional and distributed models (PySpark).
- An expert in shell scripting and writing schedulers.
- Hands-on with cloud: deploying complex data solutions in hybrid cloud / on-premise environments, both for data extraction/storage and for computation.
- Hands-on in deploying production apps using large volumes of data with state-of-the-art technologies like Docker, Kubernetes, and Kafka (a minimal Kafka consumer sketch follows this list).
- Strong on data security best practices.
- 5+ years experienced in a data engineering role.
- A science/engineering graduate from a Tier-1 university in the country.
- And most importantly, a passionate coder who really cares about building apps that help people do things better, smarter, and faster, even while they sleep.
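To anchor the Kafka point above, here is a minimal consumer sketch using the kafka-python client. The topic, brokers, and group id are placeholders; manual commits are shown because at-least-once processing is the usual default in data pipelines.

```python
# Minimal Kafka consumer sketch (kafka-python; names are placeholders).
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "user-events",                            # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="etl-workers",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,                 # commit manually, after processing
)

for message in consumer:
    record = message.value
    # ... validate / transform / load the record here ...
    consumer.commit()                         # at-least-once semantics
```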
Role: DevOps Engineer
Experience: 1 to 2 years and 2 to 5 years as a DevOps Engineer (2 positions)
Location: Bangalore. 5 days working.
Education Qualification: B.Tech/B.E. from Tier-1/Tier-2/Tier-3 colleges or equivalent institutes
Skills: DevOps engineering; Ruby on Rails or Python and Bash/shell skills; Docker, rkt, or a similar container engine; Kubernetes or similar clustering solutions.
As DevOps Engineer, you'll be part of the team building the stage for our Software Engineers to work on, helping to enhance our product performance and reliability.
Responsibilities:
- Build and operate infrastructure to support the website, backend clusters, and ML projects in the organization.
- Help teams become more autonomous, allowing the Operations team to focus on improving the infrastructure and optimizing processes.
- Deliver system management tooling to the engineering teams.
- Work on your own applications, which will be used internally.
- Contribute to open-source projects that we are using (or that we may start).
- Be an advocate for engineering best practices in and out of the company.
- Organize tech talks, participate in meetups, and represent Box8 at industry events.
- Share pager duty for the rare instances of something serious happening.
- Collaborate with other developers to understand and set up the tooling needed for Continuous Integration/Delivery/Deployment (CI/CD) practices.
Requirements:
- 1+ years of industry experience.
- Scale existing backend systems to handle ever-increasing amounts of traffic and new product requirements.
- Ruby on Rails or Python and Bash/shell skills.
- Experience managing complex systems at scale.
- Experience with Docker, rkt, or a similar container engine.
- Experience with Kubernetes or similar clustering solutions.
- Experience with tools such as Ansible or Chef.
- Understanding of the importance of smart metrics and alerting.
- Hands-on experience with cloud infrastructure provisioning, deployment, and monitoring (we are on AWS and use ECS, ELB, EC2, ElastiCache, Elasticsearch, S3, CloudWatch).
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Knowledge of data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a minimal Airflow sketch follows this list).
- Experience working on Linux-based servers.
- Managing large-scale, production-grade infrastructure on AWS Cloud.
- Experience in creating a deployment pipeline from scratch.
- Expertise in any of the CI tools, preferably Jenkins.
- Good knowledge of Docker containers and their usage.
- Using infra/app monitoring tools like CloudWatch, New Relic, or Sensu.
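As a minimal illustration of the workflow tooling named above, here is a two-task Airflow DAG. The DAG id, schedule, and commands are placeholders, not anything specific to this role.

```python
# Minimal two-task Airflow DAG (dag id, schedule, and commands are
# placeholders). Newer Airflow versions name the schedule_interval
# argument simply "schedule".
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,  # do not backfill runs before today
) as dag:
    extract = BashOperator(task_id="extract", bash_command="python extract.py")
    load = BashOperator(task_id="load", bash_command="python load.py")
    extract >> load  # "load" runs only after "extract" succeeds
```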
Good to have:
- Knowledge of Ruby on Rails based applications and their deployment methodologies.
- Experience working with container orchestration tools like Kubernetes, ECS, or Mesos.
- Extra points for experience with front-end development, New Relic, GCP, Kafka, and Elasticsearch.