11+ SAP PI Jobs in Bangalore (Bengaluru) | SAP PI Job openings in Bangalore (Bengaluru)

Role: SAP PI Architect / AIF
Location: Bangalore (Whitefield)
- Experience in Core Java, CXF, Spring.
- Experience in Spring Boot and microservices.
- Extensive experience in developing enterprise-scale n-tier applications for the financial domain. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS, preferably Sybase
- Good knowledge of multi-threading and high-volume server-side development
- Experience in sales and trading platforms in investment banking/capital markets
- Basic working knowledge of Unix/Linux
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication, and analytical skills.
- Able to clearly articulate and justify design ideas.

Job Description:
Oracle EBS Technical
Senior Programmer Analysts with 3+ years of experience working on Oracle EBS (Release 12+) Financial modules (Projects and Purchasing optional)
Expertise in D2K Forms, Reports, XML reports, PL/SQL, and API/interface development; OAF knowledge is optional
Should have knowledge of Financial modules (GL, AP, AR)
Worked on interfaces and APIs
Can work independently once requirements are shared; able to navigate and test functionality in Oracle EBS without support
Additional information
Good communication skills and accounting knowledge are a must

- 1-3 years of experience in application/web development
- Good experience working with Go (Golang).
- Good understanding of REST APIs and the web in general
- Working knowledge of AWS and Kubernetes is a plus
- Understand end user requirements, formulate use cases and come up with effective solutions
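The REST fundamentals above can be sketched with a minimal method/path dispatcher. This is an illustration only (in Python for brevity, though the role calls for Go), and the resource names and handlers are hypothetical:

```python
# Minimal sketch of REST-style method/path dispatch.
# Resource names and handlers are illustrative, not a real API.

def list_users():
    return 200, ["alice", "bob"]

def get_user(user_id):
    return 200, {"id": user_id}

ROUTES = {
    ("GET", "users"): lambda parts: list_users(),
    ("GET", "users/<id>"): lambda parts: get_user(parts[1]),
}

def dispatch(method, path):
    # Normalize "/users/7" into ["users", "7"] and build a route key.
    parts = path.strip("/").split("/")
    key = (method, parts[0]) if len(parts) == 1 else (method, parts[0] + "/<id>")
    handler = ROUTES.get(key)
    if handler is None:
        return 404, {"error": "not found"}
    return handler(parts)
```

A real service would layer this behind an HTTP server and add request validation, but the mapping from (method, path) to a handler is the core idea behind any REST framework.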

About the Role
As one of the key members of the development team, you will have the unique opportunity to redefine the architecture of our suite of products.
You will get to work directly with our founding team to deliver the most valuable and joyful experience to our customers. If you are looking to make a real impact on real people's lives and accelerate your career to new heights in the meantime, then this is the perfect opportunity for you. You will help refactor parts of the codebase to bring greater flexibility and a microservice architecture. CurbWaste intends to move to an event-driven workflow architecture, and benchmark design patterns for security and scalability will need to be implemented.
Requirements
What you will do
• Review current code and anticipate engineering bottlenecks
• Design and develop REST API interfaces
• Optimize queries
• Design a SOLR-based search solution
• Review peers' code
• Identify code libraries and design patterns
What you will need
• Experience building out RESTful APIs for front-end clients
• Basic knowledge of at least one modern front-end framework such as React, Polymer, Angular, or Vue.js
• Expert-level understanding of NodeJS and frameworks such as ExpressJS, Fast, LoopBack (preferred)
• Experience with a version control tool (we use git - GitHub and BitBucket)
• Familiarity with modern DevOps tools such as Ansible, Docker, Terraform, Fabric, Kubernetes, etc.
• SOLR or ElasticSearch experience
• Advanced knowledge of NoSQL (and SQL) databases - MongoDB, PostgreSQL
• Extensive experience with caching technologies - Redis (preferred), Memcached
• Experience with AWS services like Elastic Beanstalk, S3, EC2, Lambda, API Gateway, SQS, etc.
• Prior experience with notification delivery tools - FCM
• Understanding of patterns and techniques for building scalable back-end infrastructure, including caching, rate limiting, authentication, and authorization schemes
• Experience with programming languages such as Go (Golang) and TypeScript
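One of the back-end patterns named above, rate limiting, is often implemented as a token bucket. The sketch below (in Python for brevity; parameter values are illustrative) shows the core refill-and-spend logic:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: allows `rate` requests per second,
    with bursts of up to `capacity` requests. Illustrative sketch only;
    production limiters usually live in shared state such as Redis."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.clock = clock          # injectable for testing
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Injecting the clock keeps the limiter deterministic under test; in a distributed deployment the same arithmetic is typically done atomically in a cache such as Redis.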


Quantum Corp. leads the world in helping users collaboratively solve some of the world's hardest computer challenges. Whether it's wrangling petabytes of data for cutting-edge movie production, managing global content production workflows, exploring new sources of energy, or managing oceans of content and blending collaborative high-speed workflows with cutting-edge object storage, Quantum has the tools and technology to help you engineer the solution that works now - and preserves your work for decades to come. You are applying to an exceptional team that contributes to enhancing our position as a proven global expert in data management.
Job Summary and Duties:
This position will highlight new product development, as well as feature development and bug fixing of existing products serving the expanding big data, NAS, virtualization, replication, and file systems market. This position requires being well rounded across operating systems, including all flavors of Linux, Windows, and macOS. Building specialized core (user-space and kernel-space) software is the main function of this role.
Specific duties include but are not limited to:
• Development and maintenance of new Primary Storage products.
• Investigating and resolving issues in existing products related to application workflows, cross-platform interactions, and new operating systems.
• Triage and disposition incoming issues from support cases, customer interactions, and new feature requests.
Job Requirements:
• Minimum 8 years of combined education and experience including 5 years of C/C++ programming.
• Expertise with SMB/NFS networking protocol stacks (Samba).
• Filesystem VFS layer expertise (Samba VFS modules).
• Experience with Python and Linux shell programming.
• Experience with SAN and Ethernet networking technologies.
• Ability to work in a team environment.
• Strong communication skills.
Desired Skills:
• Knowledge of kernel internals, including any of the following: Linux and macOS.
• Experience interfacing with special-purpose file system APIs and web services.
• Knowledge and experience with container technologies (Kubernetes, Docker)
• Knowledge and experience with Virtualization technologies: ESX, KVM
• Experience working in an Agile environment using CI methodologies.


Rich data in large volumes is being collected at the edge (outside the datacenter) in use cases like autonomous vehicles, smart manufacturing, satellite imagery, smart retail, smart agriculture, etc. These datasets are characterized by being unstructured (images/videos), large (petabytes per month), and distributed (across edge, on-prem, and cloud), and they form the input for training AI models to reach higher degrees of automation.
Akridata is building products that solve these unique challenges and is at the forefront of this edge data revolution.
The company is backed by prominent VCs, has its entire software engineering team based out of India, and provides ample opportunities for from-scratch design and development.
Role:
This role is an individual contributor role with key responsibilities in developing web server
backends for Akridata management plane software that provides a ‘single pane of glass’ for users to manage assets, specify and monitor large volume data pipelines at scale involving 10s of petabytes of data.
This role involves:
1. Working with tech leads and the rest of the team on the feature design activities and
picking appropriate tools and techniques for implementation.
2. Be a hands-on developer, able to independently make correct implementation choices and follow sound development practices to ensure an enterprise-grade application.
3. Guide and mentor junior team members.
What we are looking for:
1. A Bachelor’s or Master’s degree in computer science with strong CS fundamentals and
problem solving.
2. 5+ years of hands-on experience with software development with 3+ years on web
backend development.
3. A good understanding of backend application interactions with relational databases like
MySQL, Postgres etc
4. Knowledge of web server development frameworks preferably on Python.
5. Enthusiastic to work in a dynamic, fast paced startup environment.
Good to have:
1. Hands-on experience with designing database schema and implementing and debugging SQL queries for optimal performance for large datasets
2. Experience working with applications deployed on Kubernetes clusters.
3. Experience working on a product from the early stages of its development, typically in a startup environment.
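The schema-design and query-tuning point above can be illustrated with a small self-contained sketch. SQLite stands in here for MySQL/Postgres, and the `assets` table and index names are hypothetical:

```python
import sqlite3

# In-memory database with an illustrative table of data assets.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE assets (id INTEGER PRIMARY KEY, name TEXT, size_mb REAL)")
cur.executemany("INSERT INTO assets (name, size_mb) VALUES (?, ?)",
                [(f"asset-{i}", i * 1.5) for i in range(1000)])

# Without an index, an equality filter on `name` scans the whole table.
plan = cur.execute("EXPLAIN QUERY PLAN SELECT * FROM assets WHERE name = ?",
                   ("asset-500",)).fetchall()
print(plan[0][-1])  # full-table scan

# Adding an index lets the planner search instead of scan.
cur.execute("CREATE INDEX idx_assets_name ON assets (name)")
plan = cur.execute("EXPLAIN QUERY PLAN SELECT * FROM assets WHERE name = ?",
                   ("asset-500",)).fetchall()
print(plan[0][-1])  # index search
```

On large datasets the same discipline (inspect the plan, index the filtered columns) is what keeps query latency flat as row counts grow.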
Building a highly scalable and secure payments platform
Primary owners of one or more components of the platform and will drive
innovation in your area of ownership
Working with various product teams gathering requirements and adding capabilities
Working with some of the smartest people in the industry and will have ample
opportunity to learn and grow
Using cutting-edge cryptography to secure payments beyond industry standards
Deriving actionable insights by mining TBs of data
Building low-level infrastructure that aims to push the boundaries of network
performance
Participating actively in recruitment and nurturing of engineers as awesome as you.
What do we look for?
If you spend time cracking NP-hard problems rather than cracking nuts, you are the laziest person and automate everything, and you appreciate the beauty of code (bonus if you can sing "Finite Simple Group of Order 2"), you should apply at t=0
Good understanding of Databases
Good understanding of networking (especially with HTTP)
Good understanding of OS concepts
2-4 years of experience
Should have hands on development experience with Object Oriented Programming
(Java is highly preferred) on a large scale system
Understand and showcase ownership of the products
Good with concepts of scaling and experienced with distributed systems
Review and influence new evolving design, architecture, standards and methods
with stability, maintainability and scale in mind
Identify patterns and provide solutions to classes of problems
Research, evaluate and socialize new tools, technologies, and techniques to improve
the value of the system
Be able to multi-task, prioritize and handle dependencies with minimal oversight
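One common way payment requests are secured beyond transport encryption (a general sketch, not this platform's actual design; the secret handling and function names here are assumptions) is an HMAC signature over the payload:

```python
import hmac
import hashlib

# Illustrative only: real keys come from a KMS/vault, never source code.
SECRET = b"demo-secret"

def sign(payload: bytes, secret: bytes = SECRET) -> str:
    """Return a hex HMAC-SHA256 signature for a request payload."""
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str, secret: bytes = SECRET) -> bool:
    """Recompute and compare; compare_digest avoids timing leaks."""
    return hmac.compare_digest(sign(payload, secret), signature)
```

The receiver recomputes the signature over the exact bytes it received, so any tampering with the amount or payee invalidates the request.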
Role and Responsibilities
The candidate for the role will be responsible for enabling single view for the data from multiple sources.
- Work on creating data pipelines to graph database from data lake
- Design graph database
- Write Graph Database queries for front end team to use for visualization
- Enable machine learning algorithms on graph databases
- Guide and enable junior team members
Qualifications and Education Requirements
B.Tech with 2-7 years of experience
Preferred Skills
Must Have
Hands-on exposure to graph databases like Neo4j, JanusGraph, etc.
- Hands-on exposure to programming and scripting language like Python and PySpark
- Knowledge of working on cloud platforms like GCP, AWS etc.
- Knowledge of Graph Query languages like CQL, Gremlin etc.
- Knowledge and experience of Machine Learning
Good to Have
- Knowledge of working on Hadoop environment
- Knowledge of graph algorithms
- Ability to work on tight deadlines
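Graph query languages like Cypher and Gremlin ultimately execute traversals over the stored graph. A minimal breadth-first traversal over an adjacency list (node names are hypothetical) sketches the idea behind a shortest-path query:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search returning one shortest path, or None.
    `graph` maps each node to a list of its neighbors."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Tiny illustrative graph: A -> B -> D, A -> C -> D.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
```

A Cypher query such as `shortestPath((a)-[*]-(d))` performs essentially this traversal inside the database engine, with indexes and storage-aware optimizations on top.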
