11+ BOOMI Jobs in Hyderabad | BOOMI Job openings in Hyderabad

Required Skills:
1. Strong hands-on experience with Boomi process building and deployment, Boomi EDI integration, Boomi API Management, alert frameworks/exception handling, connectors/listeners, and integration packs, covering all aspects of design, development, performance tuning, and operational support of the software suite.
2. Define clients' integration requirements through analysis and design.
3. Provide technical direction and assistance to clients regarding their integration needs.
4. Plan, design, implement, and document integration processes of varying complexity.
5. Develop test plan specifications, and test and debug processes according to plan. Work with customers on user acceptance testing.
6. Provide ongoing education and technical assistance to current and prospective customers.
7. Display initiative and self-motivation, and deliver high-quality work while meeting all deadlines for both internal and external customers.
8. Experience with multiple integration frameworks (TIBCO, MuleSoft, Oracle SOA, etc.) is highly preferred.
9. Experience implementing at least one to two full project cycles involving integration of trading partners, ERP and/or non-ERP systems, and on-premise and cloud/hybrid integrations.
10. Proven ability to design and optimize business processes and to integrate business processes across disparate systems.
11. Excellent analysis skills and the ability to develop processes and methodologies.
12. Experience in scripting (JavaScript/Groovy) is an added advantage.
13. Understanding and knowledge of JSON, XML, flat files (CSV, fixed-width), and EDI, plus knowledge of enterprise systems (CRM, ERP [NetSuite]) and MDH.
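As a small illustration of the flat-file handling mentioned above (not part of the job description), a fixed-width record can be sliced into named fields in plain Java. The field names and widths here are hypothetical; in Boomi this mapping would normally live in a Flat File profile:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FixedWidthSketch {
    // Slice a fixed-width record into named fields, preserving field order.
    // The layout (name -> width) is an assumption for illustration only.
    static Map<String, String> parse(String record, Map<String, Integer> widths) {
        Map<String, String> fields = new LinkedHashMap<>();
        int pos = 0;
        for (Map.Entry<String, Integer> e : widths.entrySet()) {
            int end = Math.min(pos + e.getValue(), record.length());
            fields.put(e.getKey(), record.substring(pos, end).trim());
            pos = end;
        }
        return fields;
    }

    public static void main(String[] args) {
        Map<String, Integer> layout = new LinkedHashMap<>();
        layout.put("id", 5);    // hypothetical 5-char id
        layout.put("name", 10); // hypothetical 10-char name
        layout.put("qty", 4);   // hypothetical 4-char quantity
        // Record is "00042" + "WIDGET    " + "  12"
        System.out.println(parse("00042WIDGET      12", layout));
    }
}
```

The same idea extends to CSV by splitting on a delimiter instead of slicing by position.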
We are not looking for candidates who:
1. Have never worked in a customer-facing role with international customers.
2. Have never led a team of at least three members.
3. Have never led high-level design and technical design discussions, or cannot define a work breakdown structure.
4. Are not flexible for 24x7 rotational shifts or are not ready to work from the office.
- Develop, enhance, and maintain Java-based applications using Spring Boot and related frameworks.
- Design, implement, and optimize Microservices with RESTful APIs.
- Build and manage Spring Batch jobs, including scheduling, chunk processing, partitioning, and error handling.
- Apply object-oriented design (OOD) and GoF design patterns (Factory, Singleton, Strategy, Observer, etc.).
- Write clean, maintainable, and scalable code following best coding standards.
- Integrate applications with databases (SQL/NoSQL) and messaging systems (Kafka/RabbitMQ).
- Participate in code reviews, technical discussions, and architectural decisions.
- Troubleshoot production issues and ensure application performance, scalability, and reliability.
- Work in an Agile/Scrum environment and collaborate with cross-functional teams.
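The chunk processing mentioned above can be sketched without the Spring Batch framework itself. This is a minimal plain-Java illustration (the `runChunkStep` method is hypothetical, not a Spring Batch API) of the same read/process/write cycle that Spring Batch runs per transaction:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Function;

public class ChunkStepSketch {
    // Chunk-oriented step sketch: read items one at a time, transform each,
    // and emit them in fixed-size chunks -- each chunk models one commit unit.
    static <I, O> List<List<O>> runChunkStep(Iterator<I> reader,
                                             Function<I, O> processor,
                                             int chunkSize) {
        List<List<O>> written = new ArrayList<>();
        List<O> chunk = new ArrayList<>();
        while (reader.hasNext()) {
            chunk.add(processor.apply(reader.next()));
            if (chunk.size() == chunkSize) {      // chunk boundary: "commit"
                written.add(chunk);
                chunk = new ArrayList<>();
            }
        }
        if (!chunk.isEmpty()) {                   // flush the partial last chunk
            written.add(chunk);
        }
        return written;
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 3, 4, 5);
        // Doubling processor with chunk size 2 -> [[2, 4], [6, 8], [10]]
        System.out.println(runChunkStep(items.iterator(), i -> i * 2, 2));
    }
}
```

In Spring Batch the reader, processor, and writer would be `ItemReader`, `ItemProcessor`, and `ItemWriter` beans wired into a chunk-oriented `Step`, with restart and error handling supplied by the framework.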
Technical Skills Required
- **Core Java 8+**, OOP, Collections, Multithreading
- **Spring Boot**, Spring MVC, Spring Data JPA
- **Spring Batch** (Job/Step configuration, Tasklets, Readers/Writers, partitioning)
- **Microservices** (REST, API Gateway, service discovery, resilience patterns)
- **Design Patterns** (Factory, Singleton, Adapter, Strategy, Builder, Observer, etc.)
- **Databases:** MySQL/PostgreSQL/Oracle, MongoDB (optional)
- **Messaging:** Kafka / RabbitMQ (preferred)
- **Build tools:** Maven/Gradle
- **CI/CD:** Jenkins, GitLab CI, or similar
- **Cloud:** AWS / Azure / GCP (optional but preferred)
- **Testing:** JUnit, Mockito
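As one illustration of the GoF patterns listed above, here is a minimal Strategy example in plain Java. The class and method names (`PricingStrategy`, `checkout`, etc.) are hypothetical, chosen for illustration only:

```java
import java.util.List;

public class StrategyDemo {
    // Strategy: the pricing rule is a pluggable behavior selected at runtime,
    // so adding a new rule never requires changing the checkout logic.
    interface PricingStrategy {
        double price(double base);
    }

    static class RegularPricing implements PricingStrategy {
        public double price(double base) { return base; }
    }

    static class SalePricing implements PricingStrategy {
        private final double discount;          // e.g. 0.20 = 20% off
        SalePricing(double discount) { this.discount = discount; }
        public double price(double base) { return base * (1.0 - discount); }
    }

    // Context: depends only on the interface, never on a concrete rule.
    static double checkout(List<Double> basket, PricingStrategy strategy) {
        return basket.stream().mapToDouble(strategy::price).sum();
    }

    public static void main(String[] args) {
        List<Double> basket = List.of(100.0, 50.0);
        System.out.println(checkout(basket, new RegularPricing()));  // 150.0
        System.out.println(checkout(basket, new SalePricing(0.20))); // 120.0
    }
}
```

The same shape underlies Factory (pluggable construction) and Observer (pluggable notification): the variable behavior sits behind an interface that the stable code depends on.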
- Create and manage ETL/ELT pipelines based on requirements.
- Build Power BI dashboards and manage the required datasets.
- Work with stakeholders to identify data structures needed for future requirements and perform any transformations, including aggregations.
- Build data cubes for real-time visualisation needs and CXO dashboards.
Required Tech Skills
- Microsoft Power BI & DAX
- Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark
- Azure Synapse, Azure Databricks, Azure HDInsight, Azure Data Factory
Job Title: Mulesoft Lead/Architect
Location: Hyderabad/ Pune
Prolifics, a pioneering technology solutions provider, is seeking a talented and inventive Mulesoft developer to join our dynamic team. At Prolifics, we believe in empowering our employees to push the boundaries of innovation, think outside the box, and deliver game-changing solutions to our clients.
To excel in this role, you should have:
- 5+ years of hands-on experience with MuleSoft and API Management, with excellent knowledge of SOA and ESB concepts.
- Configure APIs, proxy endpoints, API portals, and API analytics based on technical specifications with MuleSoft API Manager.
- Deep understanding of VPCs and DLBs.
- Deep understanding of REST, HTTP, MQ, JSON, XML, and SOA. Design and develop enterprise services using RAML in Mule, REST-based APIs, and SOAP web services, making use of different Mule connectors.
- Strong understanding of and experience with security implementations (e.g., SSL/mutual SSL, mTLS, SAML, OAuth).
- Strong understanding of and experience with Mule 4 and the DataWeave language.
- Ability to interface with clients, technology partners, testing, architecture, and analysis groups.
- Implement policies in API Manager.
- Deep experience with Anypoint Platform, flow design, API design, DataWeave, CloudHub, Runtime Fabric, and API Management.
- Knowledge of Jenkins/CI-CD processes and Azure/AWS CI-CD processes.
- Prior experience integrating different applications (SOAP, REST web services, SAP, Salesforce, databases, etc.) through MuleSoft connectors.
- Creation of mapping documents by working with source and target systems.
- Deploy APIs to CloudHub, Runtime Fabric, on-prem workers, etc.
- MuleSoft RTF deployment experience is a plus.
- Prior experience with MUnit and automation testing using JMeter.
- SonarQube experience is a plus.
- Anypoint MQ experience is a must.
- Understand and apply reusable code design, leverage application architecture/patterns and framework capabilities, and design/develop solutions that are highly reliable and scalable and perform to meet business-defined service levels.
- Experience with Splunk/ELK is a plus.
- Experience in microservices development, preferably API implementation in MuleSoft.
- Excellent communication skills
- Excellent interpersonal and analytical skills
- Excellent attention to detail.
- Must be a quick learner to ramp up on MuleSoft technology and architectural standards.
This is a work-from-office role.
Job Description
Roles & Responsibilities
- Work across the entire landscape spanning network, compute, storage, databases, applications, and the business domain.
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that enable customers to improve the end-user experience of their applications.
- Create detailed designs and solutions, validate them with internal engineering and customer teams, and establish a good network of relationships with customers and experts.
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed.
- Analyze data and provide insights and recommendations.
- Stay proactive in communicating with customers. Manage planning and execution of platform implementation at customer sites.
- Work with the product team on developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies: Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- At least 2 years of experience in IT infrastructure management.
- Experience working with large-scale IT infrastructure, including applications, databases, and networks.
- Experience working with monitoring and automation tools.
- Hands-on experience with Linux and scripting.
- Knowledge of or experience with the following technologies is an added plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, the ELK stack, REST APIs, web services, and JMX.
Location: Bangalore/Pune/Hyderabad/Nagpur
4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
- Experience with NoSQL: HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, Scala
- Good knowledge of Java
- Good background in build/configuration management and ticketing systems like Maven/Ant/JIRA
- Knowledge of any data integration and/or EDW tools is a plus
- Good to have knowledge of Python/Perl/shell scripting
Please note: HBase, Hive, and Spark are must-haves.
- Experience supporting medium/large-scale support/implementation projects on SAP ECC on HANA.
- Worked with SAP MM, SD, FICO, and HCM. Experience in S/4HANA implementation.
- Absolute understanding of mapping technical designs to Functional Documents (FS) and creating/reviewing the corresponding Technical Specs (TS).
- Detailed and exhaustive understanding of coding practices and naming conventions in ABAP.
- Delivery of new business requests/change requests with good quality, within the defined timeline and the defined budget. Data Dictionary concepts (mandatory); report development (classical/interactive/ALV); SAP Script/Smart Forms/Adobe Forms; Module Pool; SAP BAPI/RFC; performance optimizations.
- Proficient in finding the correct User Exits/SAP Enhancement Points and SAP BAdIs. IDoc configuration, extensions, and custom IDocs (mandatory as working knowledge, if not complete hands-on).
- Expertise in Java server-side development: Core Java, Golang, Servlets, Spring Core, React JS, Spring Boot with gRPC, and Hibernate.
- Exposure to microservice design patterns.
- Good knowledge in Data Structures, Algorithms, Object-Oriented Design, Analysis, Design patterns, and other computer science concepts.
- Knowledge of GraphQL, Kafka, PostgreSQL will be added advantage.
JavaScript, jQuery, Angular or Node.js, HTML, CSS, API consumption, DB knowledge, building responsive web apps.
Should have excellent problem-solving and programming skills in Python/Java.
Strong interpersonal, communication, and analytical skills.
Should have the ability to express their design ideas and thoughts.
Should have the zeal and adaptability to learn new technology frameworks.
Should have graduated in 2020 or be graduating in 2020, and have consistently scored above 75% or a CGPA of 8.