11+ Oracle iExpenses Jobs in India
Oracle EBS (R12) covering financial modules: Payables, Receivables, Assets, GL, Inventory, Cash Management, Expenses, Projects, Procurement, Contracts, and Revenue Automation.
About MyOperator:
MyOperator, India's leading cloud communication platform, is expanding its reach, and we're looking for a driven individual to strengthen our backend operations.
MyOperator empowers over 12,000 businesses globally with seamless cloud telephony and WhatsApp solutions, including IVR, call management, virtual numbers, and robust CRM integrations. We are ranked #1 in India's Call + WhatsApp Matrix, helping businesses streamline communication, boost sales, and enhance customer experience. Join our ambitious team and play a pivotal role in our continued success!
Are you a results-driven and organized professional looking to take ownership of operational excellence? We are seeking an Executive - Operations and Admin to lead and strengthen our backend operations across telecom, data center, and vendor management domains. You will play a key role in optimizing resources, improving vendor performance, and driving operational strategies to support business growth.
Key Responsibilities:
- Conduct end-to-end procurement management of telecom resources, ensuring cost efficiency and service quality through strategic vendor partnerships.
- Manage data center operations by coordinating resource procurement, monitoring performance, and recommending improvements for scalability and resilience.
- Develop and implement vendor management strategies, including vendor evaluation, negotiation, onboarding, and performance review frameworks.
- Oversee and enhance technical troubleshooting processes related to assets and server infrastructure, ensuring minimal downtime and proactive maintenance planning.
- Handle and analyze internal and external operational queries received via the ticketing system, identify recurring issues, and propose process improvements.
- Streamline day-to-day administrative operations, propose SOPs, and ensure adherence to organizational standards for operational excellence.
- Support management in operational planning, vendor audits, and cost optimization initiatives.
- Collaborate with cross-functional teams to forecast operational requirements and align resource planning with organizational goals.
- Collect, organize, and prepare data from various internal and external sources based on requirements shared by management, and support decision-making through accurate data reporting and dashboards.
Preferred Skills:
- Strong vendor management and negotiation skills.
- Ability to design and optimize operational workflows.
- Analytical mindset with a focus on process improvement.
- Technical knowledge of telecom and data center infrastructure.
- Proficiency in ticketing and reporting tools.
Skills:
- Strong oral and written communication.
- Proficiency in MS Office (Excel, Word, PowerPoint).
- Working knowledge of Google Data Studio, Amazon QuickSight, or similar tools.
- Analytical mindset with ability to handle operational data.
- Vendor management and resource planning skills.
- Excellent networking, interpersonal, and resource management skills.
Experience: 3-5 years in backend operations.
Location Requirement: Candidates must be from Delhi/NCR only.
Work Environment:
- Work-from-office only, based in Noida, Sector 2.
- Willingness to work flexible hours and roster-based timings (including Sundays).
- 6-day workweek.
- Availability to travel to data centers in different cities of India as needed.
Benefits:
- Growth Opportunity: Strong potential for elevation to higher responsibilities, subject to performance.
- Work Benefits: Laptop and mobile reimbursements.
Position: Senior Data Engineer
Overview:
We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure to support cross-functional teams and next-generation data initiatives. The ideal candidate is a hands-on data expert with strong technical proficiency in Big Data technologies and a passion for developing efficient, reliable, and future-ready data systems.
Reporting: Reports to the CEO or designated Lead as assigned by management.
Employment Type: Full-time, Permanent
Location: Remote (Pan India)
Shift Timings: 2:00 PM – 11:00 PM IST
Key Responsibilities:
- Design and develop scalable data pipeline architectures for data extraction, transformation, and loading (ETL) using modern Big Data frameworks (a brief sketch follows this list).
- Identify and implement process improvements such as automation, optimization, and infrastructure re-design for scalability and performance.
- Collaborate closely with Engineering, Product, Data Science, and Design teams to resolve data-related challenges and meet infrastructure needs.
- Partner with machine learning and analytics experts to enhance system accuracy, functionality, and innovation.
- Maintain and extend robust data workflows and ensure consistent delivery across multiple products and systems.
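For illustration only, here is a minimal PySpark sketch of the kind of ETL pipeline described above: read raw CSV data, apply a simple cleaning transformation, and write partitioned Parquet output. The paths, column names, and application name are placeholders rather than details of the actual role; on Databricks the same write could target a Delta table via .format("delta").

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV files (hypothetical input location)
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-bucket/raw/events/")
)

# Transform: deduplicate, drop rows without a timestamp, derive a date column
cleaned = (
    raw.dropDuplicates(["event_id"])               # hypothetical key column
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet output (hypothetical output location)
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/events/")
)
```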
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 10+ years of hands-on experience in Data Engineering.
- 5+ years of recent experience with Apache Spark, with a strong grasp of distributed systems and Big Data fundamentals.
- Proficiency in Scala, Python, Java, or similar languages, with the ability to work across multiple programming environments.
- Strong SQL expertise and experience working with relational databases such as PostgreSQL or MySQL.
- Proven experience with Databricks and cloud-based data ecosystems.
- Familiarity with diverse data formats such as Delta Tables, Parquet, CSV, and JSON.
- Skilled in Linux environments and shell scripting for automation and system tasks.
- Experience working within Agile teams.
- Knowledge of Machine Learning concepts is an added advantage.
- Demonstrated ability to work independently and deliver efficient, stable, and reliable software solutions.
- Excellent communication and collaboration skills in English.
About the Organization:
We are a leading B2B data and intelligence platform specializing in high-accuracy contact and company data to empower revenue teams. Our technology combines human verification and automation to ensure exceptional data quality and scalability, helping businesses make informed, data-driven decisions.
What We Offer:
Our workplace embraces diversity, inclusion, and continuous learning. With a fast-paced and evolving environment, we provide opportunities for growth through competitive benefits including:
- Paid Holidays and Leaves
- Performance Bonuses and Incentives
- Comprehensive Medical Policy
- Company-Sponsored Training Programs
We are an Equal Opportunity Employer, committed to maintaining a workplace free from discrimination and harassment. All employment decisions are made based on merit, competence, and business needs.
Key Skills Required:
· You will be part of the DevOps engineering team, configuring project environments, troubleshooting integration issues across different systems, and building new features for the next generation of cloud recovery and managed services.
· You will directly guide the technical strategy for our clients and build out a new DevOps capability within the company to improve our business relevance for customers.
· You will coordinate with the Cloud and Data teams on their requirements, verify the configurations required for each production server, and propose scalable solutions.
· You will be responsible for reviewing the infrastructure and configuration of microservices, as well as the packaging and deployment of applications.
To be the right fit, you'll need:
· Expertise in cloud services such as AWS (a brief scripting sketch follows this list).
· Experience with Terraform scripting.
· Experience with container technologies such as Docker and orchestration tools such as Kubernetes.
· Good knowledge of CI/CD tooling such as Jenkins, Bamboo, etc.
· Experience with version control systems such as Git, build tools (Maven, Ant, Gradle), and cloud automation tools (Chef, Puppet, Ansible).
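As a small illustration of the cloud scripting this role touches on, the sketch below uses Python and boto3 to report the basic configuration of running EC2 instances. The region and the Name tag are assumptions, and a real configuration check would compare the output against an expected baseline.

```python
import boto3

# Assumed region; replace with the account's actual region(s).
ec2 = boto3.client("ec2", region_name="ap-south-1")

paginator = ec2.get_paginator("describe_instances")
pages = paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)

for page in pages:
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            # Pull the Name tag if present; instances may be untagged.
            name = next(
                (t["Value"] for t in instance.get("Tags", []) if t["Key"] == "Name"),
                "unnamed",
            )
            print(name, instance["InstanceId"], instance["InstanceType"],
                  instance.get("PrivateIpAddress"))
```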
at Persistent Systems
Responsibilities:
- Develop and maintain scalable web applications using the MERN stack.
- Design and implement responsive user interfaces with React.
- Build server-side logic using Node.js and Express.js.
- Manage and optimize databases using MongoDB.
- Integrate third-party services and APIs.
- Write clean, maintainable, and efficient code.
- Collaborate with front-end and back-end teams to ensure smooth development processes.
We're seeking a qualified Team Leader who is responsible for developing the sales team, coordinating sales operations, and implementing sales techniques that allow the business to meet and surpass its sales targets consistently. The team leader is responsible for supervising, managing, and motivating team members daily, and should be able to act proactively to ensure smooth team operations and effective collaboration.
Requirements & Skills
Proven work experience as a team leader or supervisor
Excellent communication and leadership skills
Organizational and time-management skills
Decision-making skills
Job Location: Work from home
Selection process: HR round & Manager round
Qualification: Any Graduate/Post Graduate
Working days: 6 working days (Sundays off)
Shifts: 10:00 AM – 7:00 PM
Mandatory language: English
Laptop: candidates are required to use their own laptops.
Additional Compensation: If applicable, this will be decided based on your designation
- 3+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines, both streaming and batch, using frameworks such as Kafka, ELK Stack, and Fluentd, and streaming databases such as Druid (a brief consumer sketch follows this list)
- Strong industry expertise with containerization technologies, including Kubernetes and Docker Compose
- 2+ years of industry experience in developing scalable data ingestion processes and ETLs
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and managed Kafka
- Experience with scripting languages; Python experience is highly desirable
- 2+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building cloud-native applications
- Experience in administering (including setting up, managing, and monitoring) data processing pipelines, both streaming and batch, using frameworks such as Kafka, ELK Stack, and Fluentd
- Experience in API development using Swagger
- Strong expertise with containerization technologies, including Kubernetes and Docker Compose
- Experience with cloud platform services such as AWS, Azure, or GCP
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration tools such as Jenkins
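For illustration, here is a minimal Python sketch of a streaming ingestion consumer using the kafka-python client: it reads JSON events from a topic and batches them for a downstream sink. The topic name, broker address, and sink are placeholders, not details from the listing.

```python
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                              # hypothetical topic name
    bootstrap_servers=["localhost:9092"],  # assumed broker address
    group_id="ingestion-example",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:
        # Placeholder for the real sink, e.g. bulk-indexing into Elasticsearch
        # or handing the batch to a Druid ingestion task.
        print(f"flushing {len(batch)} events")
        batch.clear()
```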
- Design and implement large-scale data processing pipelines using Kafka, Fluentd, and Druid
- Assist in DevOps operations
- Develop data ingestion processes and ETLs
- Design and implement APIs (a brief sketch follows this list)
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
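As a sketch of API development with auto-generated Swagger/OpenAPI documentation, the example below uses FastAPI (one possible Python framework; the listing names Django among others, so this choice is an assumption, not the team's actual stack). FastAPI serves the generated Swagger UI at /docs.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Pipeline Admin API")  # hypothetical service name

class PipelineJob(BaseModel):
    # Hypothetical resource model for registering an ingestion job.
    name: str
    source_topic: str
    enabled: bool = True

@app.post("/jobs")
def create_job(job: PipelineJob) -> dict:
    # Placeholder: a real service would persist or schedule the job here.
    return {"status": "created", "job": job.name}

# Run with: uvicorn main:app --reload, then open /docs for the Swagger UI.
```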
Responsibilities:
- Write effective, scalable code
- Develop the Connectors tool GUI for various source and target systems, platforms, and products
- Test, debug, build and deploy UI components in the existing product
- Improve functionality of existing SDK/Frameworks
Required Skills:
- JavaScript
- ReactJS
- Redux-Saga
- ES6 (or TypeScript) syntax
- Good problem-solving skills
- Good debugging skills
Good to Have:
- TypeScript experience
- Material UI experience
- Jest knowledge
- Git and GitHub/GitLab/Bitbucket experience
Job Responsibilities:
- Experience in building both the front end and back end of websites
- Knowledge of WordPress Plugins
- Excellent written and verbal communication skills with proven fluency in English
- Good understanding of Web architecture
- Analysing website performance and troubleshooting errors
- Conduct WordPress design/development research and perform discovery analyses for new client projects, including content reviews and wireframing.
- Create WordPress design/development strategies based on research.
- Stay up-to-date with new web technologies.
Company Website: https://theempiremedia.in/
Commercial/residential building experience is a must.
Industrial building exposure will not be considered relevant experience.
Material Planning
Monitoring progress of project assigned
Coordination with Project Team, Consultants, Quantity Billing