
11+ GL Jobs in Pune | GL Job openings in Pune

Apply to 11+ GL Jobs in Pune on CutShort.io. Explore the latest GL Job opportunities across top companies like Google, Amazon & Adobe.

Bengaluru (Bangalore), Mumbai, Pune, Hyderabad, Gurugram, Chennai
5 - 10 yrs
Best in industry
GL
P2P
PPM
Core HR
Compensation
+1 more

·         Design, develop, and implement integration solutions using Oracle Integration Cloud (OIC) to connect various applications, systems, and services.

·         Customize and configure OIC adapters, connectors, and components to meet specific integration requirements.

·         Develop RESTful and SOAP web services for data exchange and communication between different systems.

·         Good knowledge of cloud technologies (e.g., AWS Lambda functions for integration with AWS).

·         Collaborate with business analysts and stakeholders to gather requirements and define integration workflows and data flows.

·         Perform troubleshooting, debugging, and performance tuning of integration solutions to ensure optimal performance and reliability.

·         Develop and maintain documentation for integration processes, interfaces, and configurations.

·         Perform code reviews.

·         Ensure adherence to coding standards, development methodologies, and security protocols throughout the software development lifecycle.
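The AWS Lambda integration mentioned above can be illustrated with a minimal sketch. The handler below is hypothetical (the `orderId`/`status` fields are invented for illustration) and simply normalizes a payload that an OIC flow might forward to AWS:

```python
import json


def lambda_handler(event, context):
    """Minimal AWS Lambda handler sketch: normalize an incoming
    integration payload (e.g. forwarded from an OIC flow).
    Field names are hypothetical, chosen only for illustration."""
    body = event.get("body")
    # API Gateway-style events carry the body as a JSON string.
    payload = json.loads(body) if isinstance(body, str) else (body or {})
    record = {
        "id": payload.get("orderId"),
        "status": (payload.get("status") or "UNKNOWN").upper(),
    }
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(record),
    }
```

Because the handler is an ordinary Python function, it can be unit-tested locally before deployment.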

Personnel Specification

Education: Bachelor’s degree in Computer Science, Information Technology, or a related field

 

Experience:

• 5 or more years of experience in the IT industry.

• Experience with cloud-based integration platforms.

 

Skills and Abilities:

·         Proven experience in designing, developing, and implementing integration solutions using OIC.

·         Strong understanding of RESTful and SOAP web services, JSON, and other data formats.

·         Experience with cloud-based integration platforms, writing Lambda functions, and creating integrations with various channels.

·         Strong knowledge of OIC API integration.

·         Strong understanding of SOAP-based and REST-based services.

·         Strong development skills in Java.

·         Strong knowledge of the authentication methodologies used across integration platforms.

·         Strong knowledge of OIC Gen 2 and Gen 3.
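As a sketch of the authentication side, the snippet below builds (but does not send) an OAuth2 client-credentials token request, a pattern commonly used when invoking OIC-hosted integrations. The endpoint and credentials are placeholders:

```python
import base64
from urllib.parse import urlencode


def build_client_credentials_request(token_url, client_id, client_secret, scope=None):
    """Build (url, headers, body) for an OAuth2 client-credentials
    token request. No HTTP request is sent here; this only shows
    the shape of the Basic auth header and form-encoded body."""
    creds = f"{client_id}:{client_secret}".encode("utf-8")
    headers = {
        "Authorization": "Basic " + base64.b64encode(creds).decode("ascii"),
        "Content-Type": "application/x-www-form-urlencoded",
    }
    form = {"grant_type": "client_credentials"}
    if scope:
        form["scope"] = scope
    return token_url, headers, urlencode(form)
```

The returned tuple can then be posted with any HTTP client; the access token from the response is sent as a Bearer token on subsequent calls.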

Hunarstreet Technologies pvt ltd
Pune, Mumbai, Mohali, Bengaluru (Bangalore), Hyderabad, Panchkula
6 - 10 yrs
₹18L - ₹23L / yr
Salesforce Lightning
System migration


Title: Salesforce Solution Architect (Classic to Lightning Migration)


Summary:

We are seeking a Salesforce Solution Architect to lead AvalonBay’s transition from Salesforce Classic to Lightning. This role is responsible for designing scalable, user-friendly solutions using native Salesforce capabilities, reducing custom technical debt, and enabling a unified associate experience across resident services and call center teams.


Key Responsibilities:

● Own the end-to-end migration strategy from Classic to Lightning.

● Redesign the data model (replace custom objects with standard Lead/Opportunity objects where applicable).

● Configure Lightning features: Dynamic Forms, Lightning Pages, Flows, Omni-Channel.

● Recommend and implement AppExchange solutions where needed.

● Collaborate with developers, admins, and QA to deliver incremental releases.

● Ensure compliance with security, profiles, and permission sets.

● Act as the trusted advisor to stakeholders for Salesforce roadmap planning.


Qualifications:

● 6+ years of Salesforce experience, including 3+ years in Lightning implementations.

● Expertise in Service Cloud, Marketing Cloud, case management, and digital engagement.

● Proven experience with Classic-to-Lightning migration projects.

● Strong understanding of low-code/no-code capabilities (Flows, Dynamic Actions).

● Salesforce Certified Application/Solution Architect (preferred).

● Excellent communication and stakeholder management skills.

Deqode

Posted by Roshni Maji
Pune, Bhopal, Jaipur, Gurugram, Bengaluru (Bangalore)
5 - 6 yrs
₹3L - ₹12L / yr
React.js
HTML/CSS
Javascript

Location: Gurgaon, Pune, Bhopal, Jaipur, Chennai, Bangalore

Work Mode: Hybrid (2 days WFO)

Experience Required: 5+ Years

Notice Period: Immediate Joiners Preferred


Job Overview

We are looking for a skilled and experienced ReactJS Developer to join our growing engineering team at Deqode. The ideal candidate will have a solid foundation in front-end development and a passion for building scalable, user-friendly web applications.


Key Responsibilities

  • Develop responsive and modern web applications using ReactJS.
  • Collaborate with cross-functional teams including UI/UX, backend, and QA.
  • Build reusable components and front-end libraries for future use.
  • Optimize applications for maximum speed and scalability.
  • Integrate RESTful APIs and manage application state using Redux or Context API.
  • Write clean, maintainable, and well-documented code.
  • Perform unit testing and participate in code reviews.


Required Skills

  • Strong proficiency in ReactJS, JavaScript (ES6+), and TypeScript.
  • Deep understanding of React lifecycle methods and hooks.
  • Experience with Redux, Context API, or similar state management tools.
  • Knowledge of HTML5, CSS3, SASS, and responsive design principles.
  • Familiarity with build tools like Webpack, Babel, and version control (Git).
  • Experience integrating third-party APIs and libraries.
  • Knowledge of Agile/Scrum methodologies.


Must-Have Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • 5+ years of experience in front-end development.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork abilities.


Information Technology Services


Agency job
via Jobdost by Sathish Kumar
Pune
5 - 9 yrs
₹10L - ₹30L / yr
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Windows Azure
+8 more
Preferred Education & Experience: 
• Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field, or equivalent practical experience. At least 3 years of relevant experience may be accepted in lieu of the above if from a different stream of education.
• Well-versed in DevOps principles & practices and hands-on DevOps
tool-chain integration experience: Release Orchestration & Automation, Source Code & Build
Management, Code Quality & Security Management, Behavior Driven Development, Test Driven
Development, Continuous Integration, Continuous Delivery, Continuous Deployment, and
Operational Monitoring & Management; extra points if you can demonstrate your knowledge with
working examples.
• Hands-on experience with demonstrable working experience with DevOps tools
and platforms viz., Slack, Jira, GIT, Jenkins, Code Quality & Security Plugins, Maven, Artifactory,
Terraform, Ansible/Chef/Puppet, Spinnaker, Tekton, StackStorm, Prometheus, Grafana, ELK,
PagerDuty, VictorOps, etc.
• Well-versed in Virtualization & Containerization; must demonstrate
experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox,
Vagrant, etc.
• Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in any of these categories: Compute, Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity & Compliance; or equivalent demonstrable Cloud Platform experience.
• Well-versed with demonstrable working experience with API Management, API Gateway, Service Mesh, Identity & Access Management, and Data Protection & Encryption tools & platforms.
• Hands-on programming experience in either core Java and/or Python and/or JavaScript
and/or Scala; freshers passing out of college or lateral movers into IT must be able to code in
languages they have studied.
• Well-versed with Storage, Networks and Storage Networking basics
which will enable you to work in a Cloud environment.
• Well-versed with Network, Data, and
Application Security basics which will enable you to work in a Cloud as well as Business
Applications / API services environment.
• Extra points if you are certified in AWS and/or Azure
and/or Google Cloud.
EnterpriseMinds

Posted by Rani Galipalli
Pune, Mumbai
5 - 8 yrs
₹1L - ₹16L / yr
Communication Skills
Business Analysis
Visio

Job Description:

  • 5-8 years of experience as a business analyst, preferably with some experience in the healthcare domain
  • Experience in capturing/defining requirements and creating business requirements documents/software requirements specifications
  • Extensive hands-on experience in creating wireframes using tools like Visio
  • Experience in communicating with stakeholders across the board, including clients, the business team, and the development team, to make sure the goals are clear and the vision is aligned with business objectives
  • Experience as a BA in software development/implementation projects is a must
  • Experience in creating and prioritizing user stories/backlog items based on the overall strategy and business objectives
  • Experience in overseeing the development of the product and taking the necessary steps to keep the project on track
  • Strong communication and documentation skills. Excellent PowerPoint, Word, and Excel skills are a must

 Good to have

  • Have worked with hospitals, outpatient clinics, or public-sector healthcare entities
  • The BA may have to travel internationally for a few weeks

 

One of the world's leading multinational investment banks


Agency job
via HiyaMee by Lithin Raj
Pune
6 - 13 yrs
₹8L - ₹15L / yr
ITIL
ServiceNow
Linux/Unix
Shell Scripting
Production support
+2 more
  • Provide hands on technical support and post-mortem root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
  • Actively address and work on user and system tickets in the Service Now ticketing application. Create and implement change tickets for enhancements, new monitoring, and assisting development groups.
  • Create, test, and implement Non-Functional Requirements (NFR) for current and new applications.
  • Build up technical subject matter expertise on the applications being supported including business flows, application architecture, and hardware configuration. Maintain documentation, knowledge articles, and runbooks.
  • Conduct real time monitoring to ensure application OLA/SLAs are achieved and maximum application availability (up time) using an array of monitoring tools.
  • Assist in the process to approve application code releases change tickets as well as tasks assigned to the support team to perform and validate the associated implementation plan.
  • Approach support with a proactive attitude, desire to seek root cause, in-depth analysis and triage, and strive to reduce inefficiencies and manual efforts.
  • Assist in special projects and view them as opportunities to enhance your skillset and support your growth. These projects can include coding in shell scripting, Python, and YAML for support functions.
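The proactive-support scripting mentioned above can be sketched as a small log-triage helper. The severity patterns and thresholds here are invented for illustration, not taken from any real runbook:

```python
import re
from collections import Counter

# Hypothetical severity patterns; insertion order matters, since
# the first matching severity wins for a given log line.
SEVERITY_PATTERNS = {
    "critical": re.compile(r"\b(OutOfMemoryError|FATAL|connection refused)\b", re.I),
    "warning": re.compile(r"\b(WARN|timeout|retrying)\b", re.I),
}


def triage(log_lines, critical_threshold=1, warning_threshold=5):
    """Count severities in a batch of log lines and decide whether
    the on-call engineer should be paged / a ticket raised."""
    counts = Counter()
    for line in log_lines:
        for severity, pattern in SEVERITY_PATTERNS.items():
            if pattern.search(line):
                counts[severity] += 1
                break
    should_page = (counts["critical"] >= critical_threshold
                   or counts["warning"] >= warning_threshold)
    return counts, should_page
```

In a real setup the decision would feed a ticketing integration (e.g. ServiceNow) rather than a boolean, but the classify-then-threshold structure is the same.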
xpressbees
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
skill iconData Science
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Artificial Intelligence (AI)
+6 more
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest-growing companies in its sector. Our vision to evolve into a strong full-service logistics organization reflects itself in our various lines of business, such as B2C logistics, 3PL, B2B Xpress, hyperlocal, and cross-border logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve into the most trusted logistics partner in India. XB has progressively carved its way towards best-in-class technology platforms, an extensive logistics network reach, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as a service provider of choice and leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees would enrich and scale its end-to-end logistics solutions at a high pace. This is a great opportunity to join
the team working on forming and delivering the operational strategy behind Artificial Intelligence / Machine Learning
and Data Engineering, leading projects and teams of AI Engineers collaborating with Data Scientists. In your role, you
will build high performance AI/ML solutions using groundbreaking AI/ML and BigData technologies. You will need to
understand business requirements and convert them to a solvable data science problem statement. You will be
involved in end to end AI/ML projects, starting from smaller scale POCs all the way to full scale ML pipelines in
production.
Seasoned AI/ML Engineers would own the implementation and productionization of cutting-edge AI-driven algorithmic
components for search, recommendation and insights to improve the efficiencies of the logistics supply chain and
serve the customer better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact to
the organization while solving challenging problems in the areas of AI, ML , Data Analytics and Computer Science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly detection, Computer Vision (e.g. loading / unloading)
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine etc.
- Customer & Tech support solutions, e.g. chat bots.
- Breach detection / prediction
An Artificial Intelligence Engineer would apply themselves in the areas of:
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning - Logistic Regression, Decision Trees, Random Forests, XGBoost, etc.
- Driving Optimization via LPs, MILPs, Stochastic Programs, and MDPs
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
The AI Engineering team enables internal teams to add AI capabilities to their Apps and Workflows easily via APIs
without needing to build AI expertise in each team – Decision Support, NLP, Computer Vision, for Public Clouds and
Enterprise in NLU, Vision, and Conversational AI. The ideal candidate is adept at working with large data sets to find
opportunities for product and process optimization and using models to test the effectiveness of different courses of
action. They must have knowledge using a variety of data mining/data analysis methods, using a variety of data tools,
building, and implementing models, using/creating algorithms, and creating/running simulations. They must be
comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion
for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes.

Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backend, that automates training and
deployment of ML models.
● Building cloud services in Decision Support (Anomaly Detection, Time series forecasting, Fraud detection,
Risk prevention, Predictive analytics), computer vision, natural language processing (NLP) and speech that
work out of the box.
● Brainstorm and Design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists/SW engineers to build out other parts of the infrastructure, effectively
communicating your needs and understanding theirs, and addressing external and internal stakeholders'
product challenges.
● Build core of Artificial Intelligence and AI Services such as Decision Support, Vision, Speech, Text, NLP, NLU,
and others.
● Leverage Cloud technology –AWS, GCP, Azure
● Experiment with ML models in Python using machine learning libraries (Pytorch, Tensorflow), Big Data,
Hadoop, HBase, Spark, etc
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to
drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product
development, marketing techniques and business strategies.
● Assess the effectiveness and accuracy of new data sources and data gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to increase and optimize customer experiences, supply chain metric and other
business outcomes.
● Develop company A/B testing framework and test model quality.
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects with data science techniques and associated libraries
such as AI/ ML or equivalent NLP (Natural Language Processing) packages. Such techniques include a good
to phenomenal understanding of statistical models, probabilistic algorithms, classification, clustering, deep
learning or related approaches as it applies to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets and architectural patterns for
successful delivery.
What is required of you?
You will get an opportunity to build and operate a suite of massive scale, integrated data/ML platforms in a broadly
distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science or Computer Engineering
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience with building high-performance, resilient, scalable, and well-engineered systems
● Experience in CI/CD and development best practices, instrumentation, logging systems
● Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights
from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as
classification, information retrieval, clustering, knowledge graph, semi-supervised learning and ranking.

● Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest,
Boosting, Trees, text mining, social network analysis, etc.
● Knowledge of using web services: Redshift, S3, Spark, Digital Ocean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression,
simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from 3rd-party providers: Google Analytics, Site Catalyst, Coremetrics,
AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, MySQL, Kafka, etc.
● Knowledge of visualizing/presenting data for stakeholders using: QuickSight, Periscope, Business Objects,
D3, ggplot, Tableau, etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests, and proper usage, etc.) and experience with applications.
● Experience building data pipelines that prep data for Machine learning and complete feedback loops.
● Knowledge of Machine Learning lifecycle and experience working with data scientists
● Experience with Relational databases and NoSQL databases
● Experience with workflow scheduling / orchestration such as Airflow or Oozie
● Working knowledge of current techniques and approaches in machine learning and statistical or
mathematical models
● Strong Data Engineering & ETL skills to build scalable data pipelines. Exposure to data streaming stack (e.g.
Kafka)
● Relevant experience in fine tuning and optimizing ML (especially Deep Learning) models to bring down
serving latency.
● Exposure to the ML model productionization stack (e.g. MLflow, Docker)
● Excellent exploratory data analysis skills to slice & dice data at scale using SQL in Redshift/BigQuery.
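One of the responsibilities listed above is developing an A/B testing framework. The core statistic behind such a framework can be sketched with the standard two-proportion z-test; this is a stdlib-only illustration, not a claim about XpressBees' actual setup:

```python
import math


def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test sketch for an A/B testing framework:
    return the z statistic for the difference in conversion rates
    between variant A (conv_a successes out of n_a trials) and
    variant B (conv_b out of n_b), using the pooled proportion."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

A |z| above roughly 1.96 corresponds to significance at the 5% level for a two-sided test; a full framework would add sample-size planning and multiple-comparison handling on top.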
Humancloud Technology Pvt Ltd
Shabbir Shaikh
Posted by Shabbir Shaikh
Pune, Hyderabad, Bengaluru (Bangalore), Aurangabad
2 - 7 yrs
₹1L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Job Description : Java Developer
We are looking for an experienced Java Developer who will work closely with the
technical lead to identify and establish best practices in the company.

Requirements & Responsibilities :
● Design and develop features using Core Java, Spring Boot, and Hibernate
● Ability to design database schema, develop views and stored procedures
● Participate in user story grooming, design discussions and proposal of solutions
● Maintain existing software systems by identifying and correcting software defects
● Practice standard development process leveraging agile methodologies such as
SCRUM and TDD
● Review and analyze business requirements and provide technical feasibility and
estimates
● Manage development / support functions etc
● Excellent grasp of OOP concepts and system design
● Strong knowledge of Core Java, Spring, Hibernate and Microservices
● Hands-on experience in DB design, SQL, UI technologies like HTML/CSS,
JavaScript, jQuery, etc.
● Good knowledge of design patterns
● Excellent knowledge of JSP, Servlets, web services, JUnit
● Experience in Agile software development
● Familiarity with JIRA, GIT, Maven
● Experience in working directly with a client
● Good knowledge in requirement gathering, analysis, and designing
Infogain
Agency job
via Technogen India Pvt Ltd by Rahul Batta
Bengaluru (Bangalore), Pune, Noida, NCR (Delhi | Gurgaon | Noida)
7 - 10 yrs
₹20L - ₹25L / yr
Data engineering
Python
SQL
Spark
PySpark
+10 more
  1. Sr. Data Engineer:

Core Skills – Data Engineering, Big Data, PySpark, Spark SQL and Python

Candidate with prior Palantir Cloud Foundry OR Clinical Trial Data Model background is preferred

Major accountabilities:

  • Responsible for Data Engineering, Foundry Data Pipeline Creation, Foundry Analysis & Reporting, Slate Application development, re-usable code development & management, and integrating Internal or External Systems with Foundry for high-quality data ingestion.
  • Have a good understanding of the Foundry Platform landscape and its capabilities
  • Performs data analysis required to troubleshoot data-related issues and assist in the resolution of data issues.
  • Defines company data assets (data models) and PySpark/Spark SQL jobs to populate data models.
  • Designs data integrations and data quality framework.
  • Design & Implement integration with Internal and External Systems and the F1 AWS platform using Foundry Data Connector or Magritte Agent
  • Collaboration with data scientists, data analysts, and technology teams to document and leverage their understanding of the Foundry integration with different data sources - Actively participate in agile work practices
  • Coordinating with Quality Engineers to ensure that all quality controls, naming conventions & best practices have been followed

Desired Candidate Profile :

  • Strong data engineering background
  • Experience with Clinical Data Model is preferred
  • Experience in
    • SQL Server, Postgres, Cassandra, Hadoop, and Spark for distributed data storage and parallel computing
    • Java and Groovy for our back-end applications and data integration tools
    • Python for data processing and analysis
    • Cloud infrastructure based on AWS EC2 and S3
  • 7+ years of IT experience, 2+ years’ experience with the Palantir Foundry Platform, 4+ years’ experience with Big Data platforms
  • 5+ years of Python and PySpark development experience
  • Strong troubleshooting and problem solving skills
  • BTech or master's degree in computer science or a related technical field
  • Experience designing, building, and maintaining big data pipelines systems
  • Hands-on experience on Palantir Foundry Platform and Foundry custom Apps development
  • Able to design and implement data integration between Palantir Foundry and external Apps based on Foundry data connector framework
  • Hands-on in programming languages primarily Python, R, Java, Unix shell scripts
  • Hands-on experience in AWS / Azure cloud platform and stack
  • Strong in API based architecture and concept, able to do quick PoC using API integration and development
  • Knowledge of machine learning and AI
  • Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users.

 Demonstrated ability to continuously learn, work independently, and make decisions with minimal supervision
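The "data quality framework" accountability above can be sketched as a minimal rule-based validator. The clinical-style field names (`subject_id`, `visit_date`) and the rules themselves are hypothetical, not part of any Foundry API:

```python
def validate_records(records, rules):
    """Tiny data-quality check sketch: apply each named rule function
    to every record and collect (record_index, rule_name) failures."""
    failures = []
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                failures.append((i, name))
    return failures


# Hypothetical rules for a clinical-trial-style dataset.
rules = {
    "subject_id_present": lambda r: bool(r.get("subject_id")),
    "visit_date_iso": lambda r: len(str(r.get("visit_date", ""))) == 10,
}
```

A production framework would report failure rates per rule and gate pipeline promotion on them; the rule-registry shape shown here is the common core.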

Plivo

Posted by Anusha Reddy
Remote, Bengaluru (Bangalore), Hyderabad, Chennai, Pune, Noida, NCR (Delhi | Gurgaon | Noida)
2 - 5 yrs
₹9L - ₹17L / yr
Software Development
SDET
Software Testing (QA)
Test Automation (QA)
Python
+5 more
  • 2-4 years of experience, with at least 1-2 years of experience in testing web applications/services and UI testing.
  • Good exposure to non-functional test strategies like load, performance, and chaos testing.
  • Good understanding of testing principles: UI and usability testing, stress testing, functional/regression testing, code coverage, TDD/BDD, UAT.
  • Experience in deploying solutions to AWS is a major plus.
  • Good team player with effective and professional oral and written communication skills.
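The load/performance testing exposure mentioned above can be illustrated with a tiny thread-pool harness. Here `request_fn` is a stand-in for a real HTTP call; this is a sketch, not a replacement for tools like Locust or JMeter:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor


def load_test(request_fn, total_requests=100, concurrency=10):
    """Minimal load-test harness sketch: fire `request_fn` from a
    thread pool and report latency statistics in seconds."""
    def timed():
        start = time.perf_counter()
        request_fn()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: timed(), range(total_requests)))
    return {
        "count": len(latencies),
        "mean_s": statistics.mean(latencies),
        "p95_s": sorted(latencies)[int(0.95 * len(latencies)) - 1],
    }
```

Swapping `request_fn` for an actual API call turns this into a crude perf probe; dedicated tools add ramp-up, distributed workers, and richer reporting.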
AMBC Technologies Pvt Ltd
Ponmuthumari Mohan
Posted by Ponmuthumari Mohan
Pune
2 - 7 yrs
₹70000 - ₹90000 / mo
.NET
JSON
JWT
• .NET Core 2.x development experience
• Create Web APIs in .NET Core; secure the application
• Know how to work with JWT for security
• SQL database experience, including:
  • Create and alter stored procedures, views, and schema
  • Create queries to validate data
• AngularJS
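The role mentions JWT for securing Web APIs. Although the stack here is .NET, the token mechanics are language-agnostic; the stdlib Python sketch below hand-rolls HS256 signing purely to illustrate the header.payload.signature structure (production code should use a vetted JWT library):

```python
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, per the JWT convention."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")


def sign_jwt(payload: dict, secret: str) -> str:
    """Produce an HS256 JWT: b64url(header).b64url(payload).b64url(sig)."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)


def verify_jwt(token: str, secret: str) -> dict:
    """Recompute the signature and return the payload if it matches."""
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(b64url(expected), sig):
        raise ValueError("bad signature")
    payload_b64 = signing_input.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

The same structure is what ASP.NET Core's JWT bearer middleware validates; only the library surface differs.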