11+ SAP EAM Jobs in Pune | SAP EAM Job openings in Pune
Apply to 11+ SAP EAM jobs in Pune on CutShort.io.
Skills:
Enterprise Asset Management/ Work Management
IBM Maximo, SAP IAM, Infor EAM
Job Responsibilities will include:
- Overall experience of 5-10 years
- Strong Functional knowledge in Enterprise Asset Management/ Work Management
- Hands-on experience in IBM Maximo, SAP IAM, or Infor EAM and its integration with supporting systems
- Experience with utilities-industry clients (electricity, gas, or water)
- Experience in the supply chain domain is a plus
- Full-time regular MBA
- A bachelor's degree in Engineering
Role Overview
You will play a crucial role in performance tuning of the product, focusing on response time, load, and scalability. The role demands hands-on expertise in performance testing tools and strong troubleshooting skills to collaborate effectively with development and architecture teams.
Key Responsibilities
- Design, develop, and execute performance test scripts using tools such as JMeter, LoadRunner, or RPT.
- Conduct multi-user scenario scripting and load/stress testing.
- Analyze performance test results and provide bottleneck analysis and recommendations.
- Collaborate with developers and architects to optimize performance across response time, scalability, and throughput.
- Monitor system health during performance testing and troubleshoot performance issues.
- Document test strategy, results, and provide actionable insights to improve product performance.
- Contribute to performance tuning and capacity planning for cloud-based applications (added advantage).
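The bottleneck-analysis step above usually comes down to latency percentiles. A minimal sketch in plain Java, not tied to JMeter/LoadRunner/RPT (the class and method names are illustrative, and the workload is a stand-in): drive concurrent work from a thread pool, record per-request latency, and report p50/p95 via nearest-rank percentiles.

```java
import java.util.*;
import java.util.concurrent.*;

public class LoadSketch {

    // Nearest-rank percentile over a list of recorded latencies (ms).
    static long percentile(List<Long> samples, double p) {
        List<Long> sorted = new ArrayList<>(samples);
        Collections.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.size());
        return sorted.get(Math.max(rank - 1, 0));
    }

    // Stand-in for a real request; a real driver would issue an HTTP call here.
    static void doWork() {
        try { Thread.sleep(5); } catch (InterruptedException ignored) { }
    }

    public static void main(String[] args) throws Exception {
        int users = 8, iterationsPerUser = 20;
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());
        for (int i = 0; i < users * iterationsPerUser; i++) {
            pool.submit(() -> {
                long start = System.nanoTime();
                doWork();
                latencies.add((System.nanoTime() - start) / 1_000_000);
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.printf("p50=%dms p95=%dms%n",
                percentile(latencies, 50), percentile(latencies, 95));
    }
}
```

Dedicated tools report these percentiles out of the box; the sketch only shows what the numbers mean.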
Required Skills & Experience
- 6–8 years of overall engineering product testing experience.
- At least 2 years in automation testing.
- At least 2 years in performance test analysis.
- Hands-on expertise in JMeter, RPT, or LoadRunner (mandatory).
- Strong background in performance engineering with the ability to troubleshoot and solve technical issues.
- Experience in cloud-based applications (preferred).
- Knowledge of C++ coding will be an added advantage.
- Excellent skills in performance monitoring, profiling, and analysis.
- Strong communication skills for technical discussions with development and architecture teams.
Job Location: Pune
Experience: 4-5 years
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Requirement / Job Description:
Core Skills:
Strong experience with Core Java (1.7 or higher), OOP concepts, and the Spring framework (Core, AOP, Batch, JMS)
Demonstrated design using web services (SOAP and REST)
Demonstrated microservices API design experience using Spring and Spring Boot
Demonstrable experience with databases such as MySQL, PostgreSQL, and Oracle PL/SQL development
Strong coding skills; good analytical and problem-solving skills
Excellent understanding of authentication, identity management, REST APIs, security, and best practices
Good understanding of web servers such as Apache Tomcat, Nginx, Vert.x/Grizzly, JBoss, etc.
Experience with OAuth principles
Strong understanding of common design patterns
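As one small illustration of the design-pattern knowledge listed above, a minimal Strategy pattern sketch in Java (the pricing-rule domain and all names here are hypothetical): the algorithm is pluggable, so the consuming class never changes when a new rule is added.

```java
// Strategy pattern: the pricing rule is the interchangeable algorithm.
// PricingStrategy and Checkout are illustrative names, not a real API.
interface PricingStrategy {
    double price(double base);
}

class Checkout {
    private final PricingStrategy strategy;

    Checkout(PricingStrategy strategy) { this.strategy = strategy; }

    // The caller picks the algorithm; Checkout stays unchanged.
    double total(double base) { return strategy.price(base); }
}

public class StrategyDemo {
    public static void main(String[] args) {
        PricingStrategy regular = base -> base;
        PricingStrategy discounted = base -> base - 10.0; // flat discount
        System.out.println(new Checkout(regular).total(100.0));    // 100.0
        System.out.println(new Checkout(discounted).total(100.0)); // 90.0
    }
}
```

Since Java 8, strategies are naturally expressed as lambdas against a single-method interface, as above.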
Other Skills:
Familiarity with Java Cryptography Architecture (JCA)
Understanding of API gateways and service registries such as Zuul and Eureka Server
Familiarity with Apache Kafka, MQTT, etc.
Responsibilities:
Design, develop, test and debug software modules for an enterprise security product
Identify areas for optimization and produce high-quality code
Collaborate with product managers and other members of the project team in requirements specification and detailed engineering analysis.
Collaborate with various stakeholders and help drive issues to proactive closure
Evaluate technology trends and bring in best practices
Innovate and propose out-of-the-box solutions
Adapt, thrive, and deliver in a rapidly evolving and demanding product development team
Come up with ways to provide an improved customer experience
About the Company
Peacock Engineering Ltd is a Gold-accredited IBM Premier Business Partner which has amassed over 300 person years of experience implementing business critical EAM (Enterprise Asset Management) solutions across a range of industries such as oil & gas, pharmaceuticals, utilities, facilities management, transport, and power generation.
Peacock Engineering Ltd specialises in providing consultancy services and support for the IBM Maximo EAM software product and maintains a pool of highly experienced and capable consultants fully conversant with IBM Maximo and its functionality, capabilities, and opportunities for customisation to meet business needs.
Main Purpose:
Peacock Engineering’s Technical Services team is now looking for an experienced UI / Front End Developer who is proficient with React.js (16.8+) to join our international team of developers delivering innovative solutions to our major UK-based customers.
Your primary focus will be working on new user interface components which are modern, secure, performant, and easy to maintain - following well-known React.js workflows and recognised best practices.
You will coordinate with the rest of our multi-disciplined team working together on different layers of the solution architecture. A commitment to collaborative problem solving, sophisticated design, and delivering a high-quality product is essential.
Specific Responsibilities:
- Developing new user-facing features for our clients using React.js
- Translating functional requirements (User Stories/Tasks) and wireframes into high quality code with tests
- Working with architects, developers, and QA engineers to ensure that your work is testable, meets industry security standards and is written to deliver good performance/scalability.
- Perform application and solution development to meet project requirements.
- Develop and document detailed technical designs to meet business requirements.
- Manage multiple technical environments and support the development and testing processes.
- Identify areas of customization and optimization and provide solutions that meet the business requirements.
Skills & Personal Qualities – Required:
- Experience working with the IBM Maximo software product
- B.Tech. in Computer Science, Engineering, or a business-related field, and/or equivalent work experience
- Thorough understanding of React.js and its core principles
- Minimum five (5) years of work experience in React application development.
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- Demonstrable expertise in software development in an Agile setting
- Ability to deliver well-tested code consistently in an Agile, CI/CD environment
- Experience with JavaScript Testing frameworks and principles (Jest preferable)
- Familiarity with newer specifications of ECMAScript
- Familiarity with RESTful APIs
- Knowledge of modern authentication/authorization mechanisms
- Familiarity with modern build pipelines and tools (Azure DevOps preferable)
- Experience with common front-end development tools such as Babel, Webpack/Parcel, NPM/Yarn, etc.
- Familiarity with Git
- Good time-management skills
- Great interpersonal and communication skills
- Good spoken & written English
Skills & Personal Qualities – Desired:
- To bring industry knowledge, world-class capabilities, innovation, and cutting-edge technology to our clients in the Resources industry to deliver business value.
- To work with leading Resources clients, their major customers, and suppliers to develop and execute projects and reliability strategies.
- To harness extensive knowledge combined with an integrated suite of methods, people, and assets to deliver sustainable, long-term solutions.
- IBM MobileFirst certification
- Java / SQL skills
Person Specification/Attributes:
- Professional and committed, with a disciplined approach to work.
- Motivated and driven by finding and providing solutions to problems.
- Polite, tactful, helpful, empathic nature, able to deliver to the needs of customers.
- Has respect for others and their views.
- Technology minded and focused, enthusiastic about technologies.
- Analytical, able to rise above the detail and see the bigger picture.
- Dedicated to continually updating and upgrading own knowledge.
- Carries a mind-set of continuous improvement, constantly looking for better and more efficient ways of doing things.
- Values quality at the centre of all things in work.
Due to considerable amounts of virtual working and interaction with colleagues and customers in different physical locations internationally, it is essential that the successful applicant has the drive and work ethic to succeed in small co-located teams while contributing to larger virtual efforts. The self-drive to communicate constantly via web collaboration and video conferencing is essential.
As an employee, you will be encouraged to continually develop your capability & attain certifications to reflect your growth as an individual.
Must-Have:
- Core Java (must be good at Core Java concepts, Java programming practices, and clean coding)
- Spring Framework (must have working experience with Spring Core)
- Any SQL framework (hands-on working experience with any of: Hibernate/JPA/MyBatis/Spring Data/others)
- Hands-on experience with REST APIs
- Hands-on experience with a SQL database (any of: Oracle/MySQL/PostgreSQL/SQL Server)
- Must have working experience with basic Git
- Maven or Gradle (must have working experience building Java projects with Maven or Gradle)
- Should be able to work independently or under someone's guidance.
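As a small illustration of the Core Java practices listed above, a hedged sketch (the record type and all names are hypothetical) that groups records and aggregates them with the Streams API:

```java
import java.util.*;
import java.util.stream.*;

public class OrderReport {

    // Hypothetical record type used only for this illustration.
    record Order(String status, double amount) { }

    // Group orders by status and sum their amounts.
    static Map<String, Double> totalsByStatus(List<Order> orders) {
        return orders.stream()
                .collect(Collectors.groupingBy(
                        Order::status,
                        Collectors.summingDouble(Order::amount)));
    }

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("PAID", 120.0),
                new Order("PAID", 80.0),
                new Order("OPEN", 40.0));
        System.out.println(totalsByStatus(orders));
    }
}
```

The same shape (immutable value type, declarative aggregation, no mutable shared state) is what "clean coding" in Core Java usually looks like in practice.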
Good to have:
- Advanced Java (threading, performance optimization)
- Spring Boot
- Other Spring frameworks (Spring Security, Spring Batch, others)
- Microservices
- Any NoSQL database (MongoDB, Cassandra, others)
- Application design concepts.
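The "Advanced Java (threading)" item above can be sketched as a simple fan-out/fan-in: split a computation across a fixed thread pool and combine the partial results. The class and names are illustrative, not from any listed product.

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.stream.*;

public class ParallelSum {

    // Fan-out: split the array into chunks, sum each on a pool thread;
    // fan-in: add up the partial results.
    static long sum(long[] values, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            int chunk = (values.length + threads - 1) / threads;
            List<Future<Long>> parts = new ArrayList<>();
            for (int t = 0; t < threads; t++) {
                int from = t * chunk;
                int to = Math.min(from + chunk, values.length);
                parts.add(pool.submit(() -> {
                    long s = 0;
                    for (int i = from; i < to; i++) s += values[i];
                    return s;
                }));
            }
            long total = 0;
            for (Future<Long> part : parts) total += part.get();
            return total;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        long[] data = LongStream.rangeClosed(1, 1000).toArray();
        System.out.println(sum(data, 4)); // 500500
    }
}
```

Each worker touches a disjoint slice, so no locking is needed; the only synchronization point is the blocking `Future.get()` during fan-in.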
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role and at least 3-4 years designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Develop batch processing, streaming and integration solutions and process Structured and Non-Structured Data
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies.
- Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy, both on-premises and in the cloud (VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
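The batch-versus-streaming distinction called out above can be illustrated in miniature (plain Java, not Azure-specific; the names are hypothetical): the same aggregate computed once over a full batch, and maintained incrementally record by record the way a streaming job would keep it.

```java
public class RunningMean {

    // Batch style: see all the records, then compute the aggregate.
    static double batchMean(double[] values) {
        double sum = 0;
        for (double v : values) sum += v;
        return sum / values.length;
    }

    // Streaming style: update the aggregate per record using
    // mean_n = mean_(n-1) + (x - mean_(n-1)) / n, keeping O(1) state.
    static double streamMean(double[] values) {
        double mean = 0;
        int n = 0;
        for (double v : values) {
            n++;
            mean += (v - mean) / n;
        }
        return mean;
    }

    public static void main(String[] args) {
        double[] data = {10, 20, 30, 40};
        System.out.println(batchMean(data));  // 25.0
        System.out.println(streamMean(data)); // 25.0
    }
}
```

The point for pipeline design is that an aggregation chosen for a data lake batch job can often be re-expressed with constant per-record state for the real-time path.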




