Key Responsibilities & Job Duties
● Provide functional support for our ERP Finance applications (Oracle E-Business Suite)
○ Assist users with their questions
○ Manage functional requests
● Train users to maintain the required skill level
○ Organize audits of system usage
○ Detect and collect issues
○ Implement action plans and coordinate actions
● Manage enhancements in accordance with the procedures that apply in the department
○ Collect enhancement requests from users
○ Challenge user requirements
○ Define priorities
○ Write functional specifications and coordinate development with the technical team
○ Test the enhancements and train users
○ Regularly communicate enhancement status to users
○ Report activity to management
○ Collaborate with functional support teams in other zones to share information
○ Optionally, manage some projects
Key Technical & Functional Competencies
● Functional competencies
○ Strong knowledge of business processes: Finance
● Technical competencies
○ Good knowledge of Oracle SQL
○ Strong knowledge of Oracle E-Business Suite modules: AP, AR, GL, FA (professional level in at least two modules)
○ Basic knowledge of Oracle E-Business Suite modules INV, PO, OE, WSH is preferred
Similar jobs
Job Title: Data Engineer
Cargill’s size and scale allows us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way. We are a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
Job Purpose and Impact
As a Data Engineer at Cargill, you will work across the full stack to design, develop and operate high-performance, data-centric solutions using our comprehensive and modern data capabilities and platforms. You will play a critical role in enabling analytical insights and process efficiencies for Cargill's diverse and complex business environments. You will work in a small team that shares your passion for building innovative, resilient and high-quality solutions while sharing, learning and growing together.
Key Accountabilities
· Collaborate with business stakeholders, product owners and across your team on product or solution designs.
· Develop robust, scalable and sustainable data products or solutions utilizing cloud-based technologies.
· Provide moderately complex technical support through all phases of product or solution life cycle.
· Perform data analysis, handle data modeling, and configure and develop data pipelines to move and optimize data assets.
· Build moderately complex prototypes to test new concepts and provide ideas on reusable frameworks, components and data products or solutions and help promote adoption of new technologies.
· Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
· Other duties as assigned
Qualifications
MINIMUM QUALIFICATIONS
· Bachelor’s degree in a related field or equivalent experience
· Minimum of two years of related work experience
· Other minimum qualifications may apply
PREFERRED QUALIFICATIONS
· Experience developing modern data architectures, including data warehouses, data lakes, data meshes and hubs, and associated capabilities including ingestion, governance, modeling, observability and more.
· Experience with data collection and ingestion capabilities, including AWS Glue, Kafka Connect and others.
· Experience with data storage and management of large, heterogeneous datasets, including formats, structures and cataloging, with tools such as Iceberg, Parquet, Avro, ORC, S3, HDFS, Hive, Kudu or others.
· Experience with transformation and modeling tools, including SQL-based transformation, orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie and others.
· Experience working in big data environments with tools such as Hadoop and Spark.
· Experience working in cloud platforms including AWS, GCP or Azure.
· Experience with streaming and stream integration or middleware platforms, tools and architectures such as Kafka, Flink, JMS or Kinesis.
· Strong programming knowledge of SQL, Python, R, Java, Scala or equivalent.
· Proficiency in engineering tooling including Docker, Git and container orchestration services.
· Strong experience working in DevOps models with a demonstrable understanding of associated best practices for code management, continuous integration and deployment strategies.
· Experience and knowledge of data governance considerations, including quality, privacy and security, and the associated implications for data product development and consumption.
Equal Opportunity Employer, including Disability/Vet.
- Proficiency in OOP concepts and C# (Professional)
- ASP.NET MVC | ASP.NET Core | EF Core | MVC applications (Advanced)
- Entity Framework | ADO.NET (Moderate)
- Proficient knowledge of MSSQL | PostgreSQL | Oracle | LINQ | Couchbase (Moderate)
- Database design, including indexes and data integrity
- Dependency injection and IoC containers using Autofac, handlers
- Understanding of HTML, JS and CSS
- Client-side frameworks: jQuery | TypeScript | Angular 3.*+ (Moderate)
- Knowledge of microservices architecture using service brokers would be an added advantage
- Knowledge of cybersecurity aspects of application development would be an added advantage
- Should be able to identify application performance bottlenecks and solve them or recommend solutions
- Develop and perform unit testing
- Working knowledge of source control with GitLab
- Project management using JIRA
DYT (Do Your Thng) is an app where social media users can share brands they love with their followers and earn money while doing so. We believe everyone is an influencer. Our aim is to democratise social media and allow people to be rewarded for the content they post. How does DYT help you? It accelerates your career through collaboration opportunities with top brands and gives you access to a community full of experts in the influencer space.
Minimum 3-4 years of experience in a sales organisation
Experience: 5+ years
Relevant JDA experience: 1+ years
• Developing games in the Unity engine
• Implementing game functionality as per communicated design
• Coordinating with the team
• Developing & maintaining efficient, reusable, and reliable code for games
• Playing, researching and ideating for new games
Requirements:
• 1 to 3 years of game development experience.
• Strong understanding of object-oriented programming; familiarity with current design and architectural patterns.
• Excellent knowledge of Unity, including scripting, textures, animation and GUI styles. Excellent knowledge of physics for 3D gameplay elements.
• Experience with game physics and particle systems.
• Hands-on experience integrating multiple third-party plugins in Unity for Android/iOS.
• Good capacity for teamwork & strong ability to resolve problems and conflicts
• Optimization, profiling, debugging
Tiger Analytics is a global AI & analytics consulting firm. With data and technology at the core of our solutions, we are solving some of the toughest problems out there. Our culture is modeled around expertise and mutual respect with a team-first mindset. Working at Tiger, you'll be at the heart of this AI revolution. You'll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.
We are headquartered in Silicon Valley and have delivery centres across the globe. The below role is for our Chennai or Bangalore office, or you can choose to work remotely.
About the Role:
As an Associate Director - Data Science at Tiger Analytics, you will lead the data science aspects of end-to-end client AI & analytics programs. Your role will be a combination of hands-on contribution, technical team management, and client interaction.
• Work closely with internal teams and client stakeholders to design analytical approaches to solve business problems
• Develop and enhance solutions to a broad range of cutting-edge data analytics and machine learning problems across a variety of industries.
• Work on various aspects of the ML ecosystem: model building, ML pipelines, logging & versioning, documentation, scaling, deployment, monitoring and maintenance, etc.
• Lead a team of data scientists and engineers to embed AI and analytics into the client business decision processes.
Desired Skills:
• High level of proficiency in a structured programming language, e.g. Python or R.
• Experience designing data science solutions to business problems
• Deep understanding of ML algorithms for common use cases in both structured and unstructured data ecosystems.
• Comfortable with large-scale data processing and distributed computing
• Excellent written and verbal communication skills
• 10+ years of experience, of which at least 8 years is relevant, hands-on data science experience.
Designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
PS: Please apply only if you can work from our office in Hanumanthanagar, Bengaluru for 2 weeks. The location is around 3.5 km from Lalbagh.
If you can watch, play and write well about Android games, this job is for you.
Responsibilities and Duties
You'd be responsible for:
- conceptualizing new content ideas,
- researching the assigned topics,
- and creating high-quality content around mobile games.
Qualifications and Skills
- Impeccable written and verbal command of English is a must.
- A graduate or post-graduate degree in English literature or technology would be a plus.
- Must NOT be over 28 years of age.