
Experience: 9-12 years
Location: Bangalore
Job Description
- Strong experience across application migration to the cloud, cloud-native architecture, Amazon EKS, and serverless (Lambda).
- Deliver customer cloud strategies aligned with customers' business objectives, with a focus on cloud migrations and app modernization.
- Design clients' cloud solutions with a focus on AWS.
- Undertake short-term delivery engagements related to cloud architecture, with a specific focus on AWS and cloud migrations/modernization.
- Provide leadership in migration and modernization methodologies and techniques, including mass application movements into the cloud.
- Implement AWS within large, regulated enterprise environments.
- Nurture cloud computing expertise internally and externally to drive cloud adoption.
- Work with designers and developers in the team to guide them through solution implementation.
- Perform proofs of concept (POCs) for various upcoming technologies to fit business requirements.
Job Title: Senior Data Engineer
The candidate should have extensive experience in designing and developing scalable data pipelines and real-time data processing solutions. As a key member of the team, the Senior Data Engineer will play a critical role in building end-to-end data workflows, supporting machine learning model deployment, and driving MLOps practices in a fast-paced, agile environment. Strong expertise in Apache Kafka, Apache Flink, AWS SageMaker, and Terraform is essential. Additional experience with infrastructure automation and CI/CD for ML models is a significant advantage.
Key Responsibilities
- Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink.
- Build scalable and automated MLOps pipelines for training, validation, and deployment of models using AWS SageMaker and associated services.
- Implement and manage Infrastructure as Code (IaC) using Terraform to provision and manage AWS environments.
- Collaborate with data scientists, ML engineers, and DevOps teams to streamline model deployment workflows and ensure reliable production delivery.
- Optimize data storage and retrieval strategies for large-scale structured and unstructured datasets.
- Develop data transformation logic and integrate data from various internal and external sources into data lakes and warehouses.
- Monitor, troubleshoot, and enhance performance of data systems in a cloud-native, fast-evolving production setup.
- Ensure adherence to data governance, privacy, and security standards across all data handling activities.
- Document data engineering solutions and workflows to facilitate cross-functional understanding and ongoing maintenance.
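The real-time pipelines described above typically revolve around windowed aggregations over event streams. As a flavor of that work, here is a minimal pure-Python sketch of a tumbling-window count; in production this logic would be expressed as a Kafka/Flink job, and the event fields used here are hypothetical:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size_s):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key in each window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_size_s) * window_size_s  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical click events: (unix_timestamp, user_id)
events = [(100, "a"), (105, "b"), (112, "a"), (125, "a")]
print(tumbling_window_counts(events, 10))
# {100: {'a': 1, 'b': 1}, 110: {'a': 1}, 120: {'a': 1}}
```

A Flink tumbling event-time window applies the same grouping continuously and handles late data and state; this sketch only shows the core bucketing idea.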

Job Title: ServiceNow Analyst – CMDB and Discovery Specialist
Location: Gurgaon, Hybrid
Experience Required: 5+ Years
Job Type: Full Time
Job Summary:
We are seeking a highly skilled and experienced ServiceNow Analyst to join our team. The ideal candidate will be proficient in managing the Configuration Management Database (CMDB) within ServiceNow, with a strong background in ServiceNow Discovery and Service Mapping. This role involves identifying and resolving Discovery errors, ensuring the accuracy and integrity of the CMDB, and contributing to the overall efficiency of our IT service management processes.
This position is part of our ServiceNow ITOM (IT Operations Management) efforts, focused on Discovery, Service Mapping, and CMDB operations.
The ServiceNow CMDB and Discovery Administrator will be responsible for helping define, implement and integrate multiple discovery sources to support the configuration management database (CMDB).
Responsibilities:
CMDB Management:
- Maintain and manage the Configuration Management Database (CMDB) within ServiceNow.
- Ensure the accuracy and integrity of CMDB data, implementing best practices and standards.
- Regularly audit and reconcile CMDB data to maintain up-to-date and accurate information.
Discovery and Service Mapping:
- Work with support teams to ensure that the Discovery tool has the appropriate access and permissions to capture configuration information.
- Perform day-to-day administration of the ServiceNow Service Mapping/Discovery tool, including mapping additional defined business services into the tool and updating discovery configurations for deeper network penetration.
- Assist the CMDB Administrator with developing and maintaining the technical design of the ServiceNow CMDB to meet the current and anticipated scope and maintain CMDB integrity to effectively support other ITSM processes such as Incident Mgt., Change Mgt., and Asset Mgt.
- Implement data governance through identification rules and configuration of multiple discovery sources that populate the CMDB.
- Apply strong hands-on experience to optimize discovery, tailor patterns, and develop new ones.
- Validate discovery results and troubleshoot as required.
- Assist the CMDB Administrator team with designing, deploying, and managing ServiceNow Service Mapping solutions for both on-premise and cloud resources, including schedules, credentials, and mid-servers.
- Define, create, and manage Service Mapping infrastructure, including patterns.
- Work with support teams to identify, investigate, and resolve Service Mapping issues, ensuring that Service Mapping is performing as expected.
- Assist in creating and maintaining Service Mapping-related documentation, aligning with other methods of populating the CMDB.
General/Admin:
- Assist with tasks related to implementing the ServiceNow application.
- Document procedures for the discovery and configuration management functions.
- Assist in developing, modifying, and configuring reports, dashboards, and outputs associated with the change, configuration, discovery, and mapping functions.
Qualifications:
- 5+ years of working experience in ServiceNow
- 3+ years' experience implementing, administering and optimizing ServiceNow Discovery and additional discovery tools
- Solid understanding of Configuration Management practices and principles
- Experience implementing ServiceNow Service Mapping
- ITIL foundations certification preferred
- Strong ability to effectively communicate (oral and written) with multiple levels of the organization (internally and externally)
- Ability to work effectively under accelerated deadlines and independently complete objectives
- Experience working in a team-oriented, collaborative environment with good interpersonal skills
- Ability to translate business requirements into technical specifications and design
- Self-starter, positive attitude, and ability to problem-solve under minimal supervision
- Intelligent, motivated, and competitive with a “roll-up-the-sleeves” and “get the job done” attitude
- Ability to manage multiple simultaneous tasks within an open and interactive environment
Job Title: Salesforce Intelligence Cloud Specialist (Datorama)
Location: Mumbai, MH, India (Remote)
Experience: 3-6 years in Salesforce Intelligence Cloud or Datorama
Job Summary:
We are looking for a skilled Salesforce Intelligence Cloud Specialist with extensive experience in Datorama to create and manage analytics dashboards for our clients. The role involves integrating data from various sources, including Google Analytics, Snowflake, and Salesforce Marketing Cloud, to provide actionable insights via Datorama.
Key Responsibilities:
- Develop and maintain Datorama dashboards to deliver comprehensive analytics reports for clients.
- Integrate and manage data from Google Analytics, Snowflake, and Salesforce Marketing Cloud within Datorama.
- Ensure data accuracy across all data sources and dashboards, troubleshooting discrepancies where necessary.
- Collaborate with internal teams to understand business requirements and translate them into data-driven insights.
- Use Datorama connectors, APIs, and custom data streams to automate data ingestion and reporting.
- Monitor dashboard performance and optimize data pipelines for efficiency and accuracy.
- Provide training and support to internal teams and clients on Datorama usage and data interpretation.
Required Skills and Qualifications:
- Hands-on experience with Salesforce Intelligence Cloud (Datorama), including dashboard creation and advanced reporting.
- Strong knowledge of Google Analytics, Snowflake, and Salesforce Marketing Cloud data integration with Datorama.
- Proficiency in Datorama connectors, APIs, and working with multiple data streams.
- Experience in data modeling, ETL processes, and best practices for data visualization and reporting.
- Strong analytical skills, with the ability to troubleshoot and ensure data integrity across multiple sources.
- Excellent communication skills for explaining complex technical information to non-technical stakeholders.
Preferred Qualifications:
- Familiarity with SQL, especially for querying databases such as Snowflake.
- Experience working with Salesforce Marketing Cloud data and building custom solutions.
- Salesforce certifications related to Datorama or Salesforce Intelligence Cloud are a plus.
Education:
- Bachelor’s degree in Computer Science, Data Analytics, Information Systems, or a related field.
Mail updated resume with salary details to:
Email: jobs[at]glansolutions[dot]com
Satish- 8851O18162
- 3+ years of development experience with Java technology
- AWS or other cloud experience
- Strong Java basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
- Proficient in REST API development
- Messaging queues (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem-solving
Good to Have Skills:
- 3+ years of experience in using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
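The "any caching mechanism" bullet above is language-agnostic; as an illustration of the idea (shown in Python for brevity, though the role itself is Java-focused), here is a minimal sketch of a TTL (time-to-live) cache, with the clock injectable so it can be tested deterministically:

```python
import time

class TTLCache:
    """Minimal time-to-live cache: entries expire ttl_seconds after insertion."""
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable clock makes expiry testable
        self._store = {}            # key -> (value, expiry_time)

    def put(self, key, value):
        self._store[key] = (value, self.clock() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if self.clock() >= expires_at:
            del self._store[key]    # lazily evict expired entries on read
            return default
        return value
```

Production equivalents (Caffeine or Ehcache on the JVM, Redis with `EXPIRE`) add eviction policies and concurrency control on top of this same expiry idea.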
As a Demand Generation Lead, you will play a critical role in driving revenue growth through strategic demand generation campaigns and initiatives. You will be responsible for executing integrated marketing campaigns, optimizing lead generation efforts, and collaborating with cross-functional teams to achieve business objectives.
Roles and Responsibilities:
- Develop and implement comprehensive demand-generation strategies that align with our business objectives and target audience. Identify target markets, segments, and personas to guide campaign planning and execution.
- Plan, create, and execute integrated marketing campaigns across various channels, including email marketing, digital advertising, content marketing, social media, and events.
- Optimize lead generation efforts through the effective use of marketing automation platforms, lead capture forms, landing pages, and lead nurturing workflows.
- Collaborate with the content team to develop high-quality content assets, such as whitepapers, case studies, blog posts, and videos, that support demand generation efforts.
- Utilize marketing analytics tools to track, analyze, and report on campaign performance, including metrics like conversion rates, ROI, and pipeline generation.
- Work closely with cross-functional teams, including sales, product marketing, design, and PR to align demand generation initiatives with overall marketing and sales strategies.
- Stay informed about market trends, competitor activities, and industry best practices related to demand generation and B2B marketing.
Requirements:
- Bachelor’s degree in marketing, business, or a related field.
- Proven experience in demand generation, digital marketing, or a related field within the B2B SaaS industry.
- Strong knowledge of demand generation tactics and channels, including email marketing, digital advertising, content marketing, social media, and events.
- Proficiency in marketing automation platforms, such as HubSpot, Marketo, or Pardot, and CRM systems like Salesforce.
- Excellent analytical skills with the ability to interpret data, track key metrics, and make data-driven decisions to optimize campaign performance.
- Exceptional project management skills with the ability to manage multiple campaigns and priorities simultaneously.
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Strategic mindset with a focus on driving results, meeting targets, and contributing to revenue growth.
As a SaaS DevOps Engineer, you will be responsible for providing automated tooling and process enhancements for SaaS deployment, application and infrastructure upgrades and production monitoring.
- Develop automation scripts and pipelines for deployment and monitoring of new production environments.
- Develop automation scripts for upgrades, hotfix deployments, and maintenance.
- Work closely with Scrum teams and product groups to support the quality and growth of the SaaS services.
- Collaborate closely with the SaaS Operations team on day-to-day production activities, including handling alerts and incidents.
- Assist the SaaS Operations team with customer-focused projects: migrations and feature enablement.
- Write knowledge articles to document known issues and best practices.
- Conduct regression tests to validate solutions or workarounds.
- Work in a globally distributed team.
What achievements should you have so far?
- Bachelor's or master's degree in Computer Science, Information Systems, or equivalent.
- Experience with containerization, deployment, and operations.
- Strong knowledge of CI/CD processes (Git, Jenkins, pipelines).
- Good experience with Linux systems and shell scripting.
- Basic cloud experience, preferably with MS Azure.
- Basic knowledge of containerized solutions (Helm, Kubernetes, Docker).
- Good networking skills and experience.
- Terraform or CloudFormation knowledge will be considered a plus.
- Ability to analyze a task from a system perspective.
- Excellent problem-solving and troubleshooting skills.
- Excellent written and verbal communication skills; mastery of English and the local language.
- Must be organized, thorough, autonomous, committed, flexible, customer-focused, and productive.
What you will do:
- Leveraging your deep knowledge to provide technical leadership to take projects from zero to completion
- Architecting, building and maintaining scalable data pipelines and access patterns related to permissions and security
- Researching, evaluating and utilising new technologies/tools/frameworks centred around high-volume data processing
- Being involved in building and deploying large-scale data processing pipelines in a production environment
- Working with data scientists and other engineers to develop data pipelines for model development and productization
- Identifying gaps and implementing solutions for data security, quality and automation of processes
- Providing input on the right tool options and model designs for use cases
- Designing scalable implementations of the models developed by our Data Scientists
Desired Candidate Profile
What you need to have:
- 3+ years of strong programming experience in PySpark and Python
- Knowledge in Python, SQL, Spark (Pyspark)
- Exposure to AWS/Azure cloud tools and services such as S3, Athena, Apache NiFi, and Apache Airflow
- Analytical and problem-solving skills
- Knowledge in Scrum & code sharing Tech: Git, Jira
- Experience with processing frameworks such as Spark, Spark Streaming, Hive, Sqoop, Kafka, etc.
- Deep understanding of measuring and ensuring data quality at scale and the required tooling to monitor and optimise the performance of our data pipelines
- Experience building data pipelines and data-centric applications using distributed storage platforms and shipping data production pipelines sourcing data from a diverse array of sources
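Measuring data quality at scale, mentioned in the last bullets above, usually starts with simple per-column metrics. A pure-Python sketch of a null-rate check follows; in production the same logic would typically be a PySpark aggregation over a DataFrame, and the column names here are hypothetical:

```python
def null_rates(rows, columns):
    """Fraction of missing (None) values per column across a batch of row dicts."""
    totals = {c: 0 for c in columns}
    for row in rows:
        for c in columns:
            if row.get(c) is None:
                totals[c] += 1
    n = len(rows)
    return {c: (totals[c] / n if n else 0.0) for c in columns}

def failing_columns(rates, threshold=0.1):
    """Columns whose null rate exceeds the allowed threshold."""
    return sorted(c for c, r in rates.items() if r > threshold)

# Hypothetical batch of user records
rows = [
    {"user_id": 1, "email": "a@x.com"},
    {"user_id": 2, "email": None},
    {"user_id": None, "email": "c@x.com"},
    {"user_id": 4, "email": None},
]
rates = null_rates(rows, ["user_id", "email"])
print(rates)                   # {'user_id': 0.25, 'email': 0.5}
print(failing_columns(rates))  # ['email', 'user_id']
```

Wiring checks like this into the pipeline (and alerting when a threshold is breached) is the kind of monitoring tooling the role describes.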
Primary Responsibilities
- Design, architect and develop advanced software solutions in a cross functional Agile team supporting multiple projects and initiatives
- Collaborate with product owners and/or the business on requirements definition, development of functional specifications, and design
- Collaborate on or lead development of technical design and specifications as required
- Code, test and document new applications as well as changes to existing system functionality and ensure successful completion
- Take on leadership roles as needed
Skills & Requirements
- Bachelor’s Degree required, preferably in Computer Science or related field
- 3+ years of software development experience using the Go or Java programming language
- Experience with cloud technologies (AWS/Azure/GCP/Pivotal Cloud Foundry/any private cloud) and containerization is required
- Experience with a micro-services architecture is a plus
- Excellent communication, collaboration, reporting, analytical, and problem-solving skills
- Experience with PostgreSQL or other Relational Databases
- Test-driven development mindset and a focus on quality, scalability and performance
- Strong programming fundamentals and ability to produce high quality code
- Solid understanding of Agile (SCRUM) Development Process required