
11+ Ariba Jobs in India

Apply to 11+ Ariba Jobs on CutShort.io. Find your next job, effortlessly. Browse Ariba Jobs and apply today!

codersbrain

at codersbrain

1 recruiter
Aishwarya Hire
Posted by Aishwarya Hire
Remote only
5 - 12 yrs
₹10L - ₹15L / yr
SAP
CEE
Ariba

1) Strong end-to-end knowledge of Procure-to-Pay processes
2) Purchase requisition creation, processing, and approval workflow processes
3) Purchase order creation, processing, and approval workflow processes
4) Sending the PO to vendors
5) Order confirmation by vendors
6) PO approvals by the buyer, considering the vendor's order confirmation
7) Goods receipt/service entry sheet by the buyer
8) Two-way and three-way match
9) Invoice from vendor
10) Experience configuring CIG
11) Invoice approval workflow
12) Payment process (desirable)
13) IT consulting experience in procurement applications such as Ariba, SAP ECC, etc. is needed
14) Working knowledge of SAP ECC procurement apps is an advantage
15) Ability to understand and map the procurement process to L1/L2/L3/L4/L5 levels across multiple countries and business categories
16) Ability to map processes to the tool's (Ariba) functionality, conduct requirement-elicitation workshops, and confirm solutions based on the available tool functionality versus the business requirement
17) Strong communication skills are a must-have
18) Ability to manage/govern geographically and functionally distributed teams and ensure program delivery is a must
19) Previous experience managing large programs is needed
20) Project planning and delivery management is a must-have skill
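
For illustration, the two-way and three-way match mentioned in point 8 can be sketched in plain Python. This is a conceptual sketch with hypothetical record fields, not Ariba/SAP code; real matching is configuration-driven and tolerance rules vary by deployment:

```python
# Illustrative invoice-matching sketch; field names are hypothetical.
# Two-way match: invoice vs. purchase order.
# Three-way match: invoice vs. purchase order vs. goods receipt.

def two_way_match(po, invoice, price_tolerance=0.0):
    """Invoice passes if quantity and unit price agree with the PO."""
    return (invoice["qty"] <= po["qty"]
            and abs(invoice["unit_price"] - po["unit_price"]) <= price_tolerance)

def three_way_match(po, goods_receipt, invoice, price_tolerance=0.0):
    """Additionally require that the invoiced quantity was actually received."""
    return (two_way_match(po, invoice, price_tolerance)
            and invoice["qty"] <= goods_receipt["qty"])

po = {"qty": 10, "unit_price": 25.0}
gr = {"qty": 8}
inv = {"qty": 8, "unit_price": 25.0}

print(two_way_match(po, inv))        # True: qty and price agree with the PO
print(three_way_match(po, gr, inv))  # True: invoiced qty was also received
```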


AI-Powered Platform

Agency job
via Peak Hire Solutions by Dharati Thakkar
Remote only
5 - 10 yrs
₹35L - ₹45L / yr
Machine Learning (ML)
Python
Artificial Intelligence (AI)
Natural Language Processing (NLP)
Scikit-Learn
+10 more

Budget: 35 LPA to 45 LPA

Work schedule is Mon to Fri, 3:30am to 12:30pm IST


Key Responsibilities:

  • Design, develop, and deploy computer vision and machine learning models for analyzing visual and document-based data.
  • Build pipelines that convert unstructured visual inputs into structured and usable information.
  • Develop and evaluate models for tasks such as object detection, segmentation, document parsing, and image understanding.
  • Apply OCR and related techniques to extract meaningful information from complex documents and imagery.
  • Work with large datasets and build efficient training and evaluation pipelines.
  • Handle real-world visual datasets that may contain noise, inconsistencies, incomplete information, or varying formats.
  • Experiment with different approaches to solve challenging computer vision problems and evaluate tradeoffs between accuracy, performance, and complexity.
  • Collaborate with product and engineering teams to integrate machine learning models into scalable production systems.
  • Continuously improve model performance, accuracy, and robustness in real-world environments.
  • Stay up to date with the latest developments in AI and computer vision and apply relevant techniques where appropriate.
  • Actively leverage modern AI tools and frameworks to accelerate experimentation, development, and engineering workflows.
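
As one concrete example of the model-evaluation work listed above, object-detection outputs are commonly scored with intersection-over-union (IoU). A minimal pure-Python sketch, assuming boxes are (x1, y1, x2, y2) tuples:

```python
# Intersection-over-union (IoU) between two axis-aligned boxes,
# each given as (x1, y1, x2, y2) with x1 < x2 and y1 < y2.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection corners
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7: overlap area 1, union area 7
```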


Requirements:

  • 5+ years of hands-on experience building and deploying machine learning models, particularly in Computer Vision or document understanding.
  • Strong proficiency in Python for machine learning and data processing.
  • Hands-on experience with modern ML frameworks such as PyTorch and libraries in the Hugging Face ecosystem.
  • Experience with computer vision tooling such as OpenCV.
  • Experience with common ML and data science libraries such as scikit-learn, NumPy, and Pandas.
  • Experience developing models for tasks such as segmentation, object detection, or document analysis.
  • Experience working with large image datasets and building training pipelines.
  • Solid understanding of model evaluation, data preprocessing, and performance optimization.
  • Strong problem-solving skills and ability to work in a fast-paced product environment.
  • Ability to collaborate effectively with cross-functional engineering and product teams.
  • The candidate should be based in India.
  • Willing to work remotely full-time.
  • Work schedule: Mon to Fri, 3:30 AM to 12:30 PM IST.


Preferred Qualifications:

  • Experience with TensorFlow or other deep learning frameworks.
  • Experience working with OCR pipelines or document analysis systems.
  • Experience deploying machine learning models in production environments.
  • Experience with containerized deployments such as Docker or Kubernetes.
  • Experience working with complex technical documents, diagrams, or structured visual data.
  • Familiarity with spatial or geometry-related data problems.
  • Experience with libraries such as Detectron2, MMDetection, or similar.
  • Familiarity with frameworks used to integrate modern AI models into applications (e.g., LangChain or similar tooling).
  • Contributions to open-source ML or computer vision projects are a plus.


Additional Information:

  • The problems we work on involve complex visual and document-based data, so we value engineers who enjoy tackling challenging technical problems and experimenting with different approaches to reach practical solutions.
  • Candidates are required to include links to relevant projects, GitHub repositories, research work, or examples of machine learning systems they have built.


Benefits:

  • Flexible remote work with career development opportunities
  • Engagement with a supportive and collaborative global team
  • Competitive, market-based salary
Koantek
Bhoomika Varshney
Posted by Bhoomika Varshney
Remote only
4 - 8 yrs
₹10L - ₹30L / yr
Python
Databricks
SQL
Spark
PySpark
+3 more

The Sr. AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive, modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and deliver patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.

Requirements:

  • Strong experience as an AWS/Azure/GCP Data Engineer; AWS/Azure/GCP Databricks experience is a must
  • Expert proficiency in Spark, Scala, and Python
  • Must have data migration experience from on-prem to cloud
  • Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB
  • In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
  • Expert-level hands-on experience designing and developing applications on Databricks
  • Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
  • In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
  • Hands-on experience with the industry technology stack for data management, ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
  • Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
  • Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
  • Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
  • Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
  • Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
  • Good to have: programming experience with .NET or Spark/Scala
  • Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala and Spark SQL/PySpark
  • Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
  • Working experience with Visual Studio, PowerShell scripting, and ARM templates
  • Able to build ingestion to ADLS and enable a BI layer for analytics
  • Strong understanding of data modeling and defining conceptual, logical, and physical data models
  • Big data/analytics/information analysis/database management in the cloud
  • IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
  • Ability to stay up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
  • Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
  • Guide customers in transforming big data projects, including development and deployment of big data and AI applications
  • Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate as needed
  • 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions
  • Overall 5+ years of experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies, with hands-on expertise in Apache Spark (Scala or Python)
  • 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
  • Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
  • Ability to manage competing priorities in a fast-paced environment
  • Ability to resolve issues
  • Basic experience with or knowledge of agile methodologies
  • AWS Certified Solutions Architect - Professional
  • Databricks Certified Associate Developer for Apache Spark
  • Microsoft Certified: Azure Data Engineer Associate
  • Google Cloud Certified - Professional (GCP)
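
The bucketing mentioned in the requirements (as Spark applies it when writing bucketed tables) can be illustrated in plain Python. This is a conceptual sketch of the idea, not Spark API code; the hash function and field names here are hypothetical:

```python
# Conceptual sketch of hash bucketing, as used when writing bucketed tables:
# rows with the same key always hash to the same bucket, so joins and
# aggregations on that key can avoid a full shuffle of the data.
import zlib

def bucket_of(key, num_buckets):
    """Deterministic bucket assignment via a stable hash of the key."""
    return zlib.crc32(str(key).encode()) % num_buckets

rows = [{"user_id": u, "amount": a}
        for u, a in [(101, 5.0), (202, 7.5), (101, 2.0), (303, 1.0)]]

# Group rows into 4 buckets by user_id.
buckets = {}
for row in rows:
    buckets.setdefault(bucket_of(row["user_id"], 4), []).append(row)

# Every row for user 101 lands in the same single bucket.
b = bucket_of(101, 4)
assert all(r in buckets[b] for r in rows if r["user_id"] == 101)
```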

Netwalk
Thiruvananthapuram
4 - 10 yrs
₹15L - ₹28L / yr
Embedded C++
C++
Embedded software
Object Oriented Programming (OOPs)
Object Oriented Analysis

Responsibilities:

  • Software development in C++ for an autonomous-drive project
  • Qt library (no GUI features)
  • Object-oriented analysis / object-oriented design
  • C++ template implementation
  • C++17 specifics such as std::optional
  • Macro implementation
  • Implementation of clean code
  • Static code analysis
  • CMake


Qualifications:

  • Excellent Git knowledge, especially merging and rebasing
  • University degree in Electrical/Electronic Engineering, Computer Science, or similar
  • Minimum 1 to 5 years of embedded software development experience on Yocto Linux-based projects in the automotive domain
  • Expert in C++ programming
  • Strong debugging skills
  • Good communication skills

 

Growtomation Marketing Solutions
Gurugram
2 - 5 yrs
₹10L - ₹25L / yr
JavaScript
TypeScript
AWS
Azure
GitHub
+2 more

Responsibilities:

  • Design, develop, and maintain responsive web applications using Node.js, Next.js, and React.js.
  • Implement robust APIs and services using Node.js.
  • Ensure the technical feasibility of UI/UX designs and optimize applications for maximum speed and scalability.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Integrate data from various back-end services and databases.
  • Manage and deploy applications on cloud platforms such as AWS, GCP, or Azure.
  • Utilize version control tools such as Git to manage codebase changes.
  • Implement continuous integration and continuous deployment (CI/CD) pipelines to streamline development and deployment processes.
  • Stay updated with emerging trends and technologies in the software development industry.

Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 2-4 years of proven experience as a Full Stack developer.
  • Strong proficiency in JavaScript/TypeScript.
  • Experience with cloud services (AWS, GCP, or Azure) and managing scalable applications in the cloud.
  • Solid understanding of version control tools, preferably Git.
  • Knowledge of CI/CD tools and methodologies.
  • Excellent problem-solving skills and critical thinking abilities.
  • Strong communication and teamwork skills.
  • Ability to handle multiple projects and meet deadlines


SkillDeck

at SkillDeck

1 recruiter
Namrata Choudhary
Posted by Namrata Choudhary
Remote only
1 - 7 yrs
₹3L - ₹6L / yr
Sales
Sales management
Business Development
Communication Skills
Effective communication
+1 more
As a Corporate Sales Manager, you will be responsible for developing and implementing strategies to drive revenue growth and achieve targets within the corporate sector. In this role, you will work closely with the corporate sales team to identify and pursue new business opportunities, create and maintain strong relationships with key decision-makers in the corporate sector, and effectively negotiate and close deals.

To be successful in this role, you should have a proven track record in sales, excellent communication and negotiation skills, and the ability to work in a fast-paced environment. You should also be highly organized and able to manage and close deals with corporate clients.

Additional responsibilities may include:
- Setting and achieving sales targets
- Developing and implementing sales plans and programs
- Analyzing market trends and identifying new business opportunities
- Building and expanding a new client base across corporates and colleges
- Training and coaching team members to ensure their success

If you are a results-driven individual with a passion for sales and a desire to succeed, we encourage you to apply for this exciting opportunity.
Number Theory

at Number Theory

3 recruiters
Nidhi Mishra
Posted by Nidhi Mishra
Gurugram
5 - 8 yrs
₹15L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)

Experience:

 

The candidate should have about 5+ years of experience in design and development with Java/Scala. Experience with algorithms, data structures, databases, and distributed-system architectures is mandatory.

 

Required Skills:

  1. In-depth knowledge of Hadoop and Spark architecture and their components, such as HDFS, YARN, executors, cores, and memory parameters
  2. Knowledge of both Scala and Java
  3. Extensive experience developing Spark jobs. Should possess good OOP knowledge and be aware of enterprise application design patterns.
  4. Good knowledge of Unix/Linux.
  5. Experience working on large-scale software projects
  6. Understanding the big picture and the various use cases involved while crafting the solution, and documenting them in Unified Modeling Language (UML).
  7. Own and maintain the architecture document.
  8. Keep an eye out for technological trends and open-source projects that can be used.
  9. Knows common programming languages and frameworks.
  10. Real-time streaming data consumption

Good to have :

  1. Azure/AWS cloud knowledge on the data storage and compute side
  2. Knowledge of multi-tenant architecture
  3. Basic understanding of data science

 

A Reputed Japanese MNC

Agency job
via Selective Global Search by Nansi Garg
Noida
4 - 7 yrs
₹5L - ₹14L / yr
React.js
User Interface (UI) Design
User Experience (UX) Design
Data security
Azure Devops

Mandatory:

  • 4-7 years of experience in React.js (a minimum of 2.5 years of React.js is compulsory)
  • Developing complex IT systems with various system integrations and configurations
  • Data security/GDPR

Optional:

  • Knowledge of UX/UI design, Azure DevOps, Test-Driven Development, and Domain-Driven Development
Symansys Technologies India Pvt Ltd
Tanu Chauhan
Posted by Tanu Chauhan
Pune, Mumbai
2 - 8 yrs
₹5L - ₹15L / yr
Data Science
Machine Learning (ML)
Python
Tableau
R Programming
+1 more

Specialism: Advanced Analytics, data science, regression, forecasting, analytics, SQL, R, Python, decision trees, random forests, SAS, clustering, classification

Senior Analytics Consultant- Responsibilities

  • Understand the business problem and requirements by building domain knowledge, and translate them into a data science problem
  • Conceptualize and design cutting-edge data science solutions to that problem, applying design-thinking concepts
  • Identify the right algorithms, tech stack, and sample outputs required to efficiently address the end need
  • Prototype and experiment with the solution to successfully demonstrate its value
  • Independently, or with support from the team, execute the conceptualized solution as planned, following project management guidelines
  • Present the results to internal and client stakeholders in an easy-to-understand manner with great storytelling, storyboarding, insights, and visualization
  • Help build overall data science capability for eClerx through support in pilots, pre-sales pitches, product development, and practice development initiatives
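
Of the techniques this role lists (regression, clustering, decision trees), simple linear regression is the most compact to sketch. A closed-form ordinary-least-squares fit in plain Python, on toy data with no R/SAS/scikit-learn dependency:

```python
# Ordinary least squares for y ≈ slope * x + intercept (closed form).

def ols_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n            # sample means
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx                # intercept from the means

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]                # exactly y = 2x + 1
slope, intercept = ols_fit(xs, ys)
print(slope, intercept)          # 2.0 1.0
```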
Envigo Marketing Pvt. Ltd.
Dheeraj Nagar
Posted by Dheeraj Nagar
Remote, Gurgaon
3 - 6 yrs
₹5L - ₹7L / yr
Magento
JavaScript
WooCommerce
PHP
MySQL
+1 more
  1. Develop and manage e-commerce websites and web applications.
  2. Analyze, design, code, debug, test, document, and deploy applications.
  3. Participate in project and deployment planning.
  4. Must be a self-starter, able to work with minimum supervision.
  5. Experience in module/extension development and customization.
  6. Experience in theme integration and customization.
  7. Experience in API creation and integration.
  8. Experience in migration from Magento 1 to Magento 2.
  9. Extensive experience with PHP and MySQL.
  10. Exposure to Magento 2, CMS, and JavaScript frameworks such as jQuery.
  11. Demonstrable knowledge of XML, XHTML, CSS, and modules (e.g., API integrations and payment gateways), with a focus on standards.
  12. Demonstrable source control experience.
  13. Two or more published e-commerce websites.
ZipGrid - MyAashiana Management Services
Kunal Gupta
Posted by Kunal Gupta
Mumbai
3 - 7 yrs
₹5L - ₹10L / yr
Sales to communities
Multi-tenanted sales
Dealing with a multitude of decision-makers
Looking for a sales manager for a community knowledge services provider. Previous work experience in roles such as TASC (for banks) is preferred.