Responsibilities
• Should be able to understand functional & technical requirements from the business and come up with solutions.
• Development, support, debugging, and troubleshooting of existing applications (Forms, Reports, Interfaces, etc.).
• Should be able to configure Oracle EBS/Cloud including the configuration of Business Processes and security.
• Integration of customized applications with standard Oracle application modules.
• Should be very strong in SQL and strong in PL/SQL (especially packages, procedures, and functions); a small example follows this list.
• Database structure design for tables, views, sequences, synonyms, database triggers, etc.
• Training users/BSAs when required & preparing technical documentation.
• Ensure customer satisfaction; be self-driven and perform duties with minimal supervision.
• Ensure tickets are resolved within SLAs and assignments are completed within the agreed deadlines.
• Able to understand existing PL/SQL packages/procedures/functions and modify or create new ones as needed.
• Ability and willingness to stay updated with the latest developments and updates in Oracle Fusion technologies and best practices.
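To give a concrete flavor of the PL/SQL integration work described above, here is a minimal Python sketch using the python-oracledb driver to call a procedure inside a custom package. The package and procedure name (xx_hr_pkg.sync_employee), connection details, and parameters are purely hypothetical placeholders, not part of any actual codebase for this role.

import oracledb  # python-oracledb driver (pip install oracledb)

# Hypothetical connection details; in practice these come from secure configuration.
conn = oracledb.connect(user="apps_ro", password="***", dsn="ebsdb.example.com/EBSPROD")

with conn.cursor() as cur:
    # Call a hypothetical custom procedure that syncs one employee record and
    # returns a status string through an OUT parameter.
    status = cur.var(oracledb.DB_TYPE_VARCHAR)
    cur.callproc("xx_hr_pkg.sync_employee", [1001, status])
    print("sync status:", status.getvalue())

conn.close()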
Requirements:
• 4+ years of technical or techno-functional experience in Oracle EBS R12/Cloud Applications on Oracle HR/HCM (any HR module), with SQL/PL/SQL.
• Good experience with HR/HCM (any HR module) is a must, along with an understanding of its functionality.
• Should be able to understand the data model of any HR module and be quick to switch from one product to another.
• Ability to develop custom solutions, extensions, and integrations to meet specific business requirements.
• Proficiency in creating custom reports, dashboards, and analytics using Oracle BI Publisher, Oracle OTBI (Oracle Transactional Business Intelligence), and other reporting tools.
• Knowledge of security models, user roles, and access controls within Oracle Fusion applications to ensure data integrity and compliance.
• Strong analytical and problem-solving skills to identify and address technical issues, system errors, and integration challenges.
• Good to have: experience with performance tuning techniques.
• Effective communication skills at all organizational levels, including written, oral, and presentation skills.
• Should be a good team player with a can-do attitude.
• Capability and willingness to switch and work in any technology on a need basis.
• Should have a good understanding of working in an onsite-offshore model.
Similar jobs
We are looking for a DevOps Engineer with experience to enhance our cloud infrastructure and optimize application performance.
- Bachelor’s degree in Computer Science or related field.
- 5+ years of DevOps experience with strong scripting skills (shell, Python, Ruby).
- Familiarity with open-source technologies and application development methodologies.
- Experience in optimizing both stand-alone and distributed systems.
Key Responsibilities
- Design and maintain DevOps practices for seamless application deployment.
- Utilize AWS tools (EBS, S3, EC2) and automation technologies (Ansible, Terraform); see the sketch after this list.
- Manage Docker containers and Kubernetes environments.
- Implement CI/CD pipelines with tools like Jenkins and GitLab.
- Use monitoring tools (Datadog, Prometheus) for system reliability.
- Collaborate effectively across teams and articulate technical choices.
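To make the AWS item above more concrete, here is a minimal boto3 sketch that lists running EC2 instances with their attached EBS volumes and enumerates S3 buckets. It assumes AWS credentials are already configured in the environment; the region value is only an illustrative placeholder.

import boto3

# Assumes credentials/region are configured (e.g. via ~/.aws or environment variables).
ec2 = boto3.client("ec2", region_name="us-east-1")
s3 = boto3.client("s3")

# List running EC2 instances and any EBS volumes attached to them.
reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]
for res in reservations:
    for inst in res["Instances"]:
        volumes = [bdm["Ebs"]["VolumeId"]
                   for bdm in inst.get("BlockDeviceMappings", [])
                   if "Ebs" in bdm]
        print(inst["InstanceId"], volumes)

# List S3 buckets as a quick inventory of storage resources.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])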
companies uncover the 3% of active buyers in their target market. It evaluates over 100 billion data points and analyzes factors such as buyer journeys, technology adoption patterns, and other digital footprints to deliver market & sales intelligence. Its customers have access to the buying patterns and contact information of more than 17 million companies and 70 million decision makers across the world.
Role – Data Engineer
Responsibilities
- Work in collaboration with the application team and integration team to design, create, and maintain optimal data pipeline architecture and data structures for the Data Lake/Data Warehouse.
- Work with stakeholders including the Sales, Product, and Customer Support teams to assist with data-related technical issues and support their data analytics needs.
- Assemble large, complex data sets from third-party vendors to meet business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Elasticsearch, MongoDB, and AWS technology.
- Streamline existing and introduce enhanced reporting and analysis solutions that leverage complex data sources derived from multiple internal systems.
Requirements
- 5+ years of experience in a Data Engineer role.
- Proficiency in Linux.
- Must have SQL knowledge and experience working with relational databases and query authoring, as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena.
- Must have experience with Python/Scala.
- Must have experience with Big Data technologies like Apache Spark.
- Must have experience with Apache Airflow (a minimal DAG sketch follows this section).
- Experience with data pipeline and ETL tools like AWS Glue.
- Experience working with AWS cloud services: EC2, S3, RDS, and Redshift.
This role is work-from-office.
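Since Apache Airflow is a hard requirement, the following is a minimal DAG sketch in the Airflow 2.x style. The DAG name, schedule, and task bodies are hypothetical placeholders rather than an actual pipeline for this role.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system (e.g. an S3 drop or a vendor API).
    print("extracting source data")


def load():
    # Placeholder: write transformed records into the warehouse (e.g. Redshift).
    print("loading into warehouse")


with DAG(
    dag_id="example_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # extract must finish before load starts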
Job Description
Roles & Responsibilities
- Work across the entire landscape that spans network, compute, storage, databases, applications, and business domain
- Use the Big Data and AI-driven features of vuSmartMaps to provide solutions that will enable customers to improve the end-user experience for their applications
- Create detailed designs and solutions, validate them with internal engineering and customer teams, and establish a good network of relationships with customers and experts
- Understand the application architecture and transaction-level workflow to identify touchpoints and metrics to be monitored and analyzed
- Analyze data and provide insights and recommendations
- Constantly stay ahead in communicating with customers. Manage planning and execution of platform implementation at customer sites.
- Work with the product team in developing new features, identifying solution gaps, etc.
- Interest and aptitude in learning new technologies: Big Data, NoSQL databases, Elasticsearch, MongoDB, DevOps.
Skills & Experience
- 2+ years of experience in IT Infrastructure Management
- Experience in working with large-scale IT infra, including applications, databases, and networks.
- Experience in working with monitoring and automation tools
- Hands-on experience in Linux and scripting.
- Knowledge/experience in the following technologies is a plus: Elasticsearch, Kafka, Docker containers, MongoDB, Big Data, SQL databases, the ELK stack, REST APIs, web services, and JMX.
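For the Elasticsearch item above, a minimal sketch with the official Python client (elasticsearch-py 8.x is assumed; the endpoint, index name, and field are illustrative placeholders):

from elasticsearch import Elasticsearch

# Assumes a locally reachable cluster; real deployments would add auth and TLS.
es = Elasticsearch("http://localhost:9200")

# Hypothetical query: the ten most recent error-level entries from an "app-logs" index.
resp = es.search(
    index="app-logs",
    query={"match": {"level": "error"}},
    size=10,
)

for hit in resp["hits"]["hits"]:
    print(hit["_source"])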
Tableau Developer
Skills: Tableau Prep (must), Tableau, SQL
Must have experience in Tableau Prep
Minimum 4 years of experience
Should have experience in SQL
Must have sound experience & knowledge of PHP & MySQL
Develop Magento modules in PHP using best practices.
Should possess advanced knowledge of Magento, JavaScript, HTML, PHP, and MySQL.
The Magento Developer position is a mid-senior level position that will take a key role in Magento extension/theme development.
Job Role: Magento Developer
Location: Remote
Responsibilities
• Write and review extensible code
• Design, test and deploy Magento applications
• Follow industry-standard best practices
• Assume responsibility for security of code and data
• Contribute to continual enhancement of product
Requirements
• 6+ years of software engineering experience in PHP
• 2+ years of hands-on experience in Magento / Magento 2 (not just themes, but module development, advanced-level customizations, and performance tuning)
• Understanding of Magento's code structure, extension architecture, theme hierarchy, and fallback components
• Experience installing and configuring third-party extensions on Magento 2
• Magento 2 Ecommerce Enterprise edition preferred
• Familiarity with Magento security practices
• Ability to analyze and optimize UI and back-end application code for efficiency and performance
• Understanding of version control management, especially git
Location: Bangalore
Purpose: The person in this position will be responsible for backend integration of deep learning algorithms and for creating dashboards for clients.
Roles & Responsibilities:
Demonstrates a growth mindset, seeks feedback often, and is effective in continuous personal and professional development.
Provides expertise in all phases of the development lifecycle, from concept and design to testing.
Defines the architecture, best practices, and coding standards for the product development team.
Supports continuous technical improvement by investigating alternatives and technologies and presenting these for architectural review.
Motivates team members and extends goodwill to other employees while having fun!
Job Requirements:
2+ years of software industry experience.
Strong expertise in JS, PHP, React, Node, Angular 2+, MySQL, and PostgreSQL.
Solid understanding of software design, development, testing, and problem-solving.
Expertise in coding efficient, high-quality, and modularized software.
Experience in developing web services: REST/SOAP APIs, HTTP APIs, microservices.
Experience setting up and managing servers; DevOps experience is a big plus.
Strong exposure to databases: RDBMS such as PostgreSQL, and NoSQL databases such as DynamoDB and Elasticsearch.
Experience with cloud/storage on Amazon (AWS): EC2, EBS, S3.
Expertise in test automation.
Familiarity with the Unix shell and with source control systems and tools such as git.
Strong technical leadership skills.
Comfortable collaborating with designers, front-end developers, and other team members.
Strong communication skills.
Technical coaching and mentoring skills.
Understanding of machine learning and natural language processing is a plus.