Cargill Business Services
http://cargill.com
Jobs at Cargill Business Services
Job Title: Engineer Consultant (Software Engineer)
Job Purpose and Impact
The Engineer Consultant will join the Global Supply Chain team within the Digital Technology and Data function (also known as IT). As a Software Engineer, you will provide best-practice solutions to architect, design and develop new and existing digital solutions for the organization's digital portfolio. In this role, you will discover and deliver innovative solutions to complex and varied problems to enable the company's digital future. You will bring clarity to ambiguous scenarios, apply specialized, in-depth and broad knowledge of architectural, engineering and security practices to ensure your solutions are scalable, resilient and robust, and share knowledge of modern best practices and technologies with the shared engineering community.
Key Accountabilities
- Apply innovative and advanced software engineering patterns and principles to design, develop, test, integrate, maintain and troubleshoot complex and varied software solutions and incorporate security practices in newly developed and maintained applications.
- Lead peer code review sessions to review code, coach peers and ensure code quality.
- Take the lead in the assigned agile team to adopt agile philosophies, facilitate agile ceremonies and identify continuous improvement opportunities.
- Establish and incorporate the company's engineering and development best practices within the full software development lifecycle including coding standards, code reviews, source control management, building processes, testing and security principles, to deliver high quality code rapidly.
- Lead demonstration and continuous feedback sessions to improve development and help drive the long term vision.
- Build innovation in the engineering community by maintaining and sharing relevant technical approaches and modern skills.
- Independently handle complex issues with minimal supervision, while escalating only the most complex issues to appropriate staff.
- Support critical supply chain applications on a rotational on-call basis as needed.
- Other duties as assigned.
Qualifications
Minimum Qualifications
- Bachelor's degree in a related field or equivalent experience.
- Minimum of four years of related work experience.
Preferred Qualifications
- Experience building Supply Chain applications and implementing Supply Chain software packages (SaaS/PaaS).
- Proven experience automating pipelines for continuous integration, testing, delivery and security (a minimal pipeline-gate sketch follows this list).
- Proven experience in architecting applications, databases, services or integrations.
- Experience with C3.ai and/or o9 solutions.
- Proven ability to quickly learn new languages and platforms.
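For context on the pipeline-automation qualification above, here is a minimal, illustrative Python sketch of a CI quality gate: it runs the test suite and a dependency security audit and stops the build on the first failure. The tools invoked (pytest, pip-audit) are assumptions chosen for the example, not tools named in the posting.

```python
"""Minimal CI gate sketch (illustrative): run tests and a dependency audit,
and abort the build on the first failing step. pytest and pip-audit are
assumed tools, not requirements from the posting."""
import subprocess
import sys


def run_step(name: str, command: list[str]) -> None:
    """Run one pipeline step and abort the build if it fails."""
    print(f"--- {name} ---")
    result = subprocess.run(command)
    if result.returncode != 0:
        sys.exit(f"{name} failed with exit code {result.returncode}")


if __name__ == "__main__":
    run_step("Unit tests", ["pytest", "--maxfail=1", "-q"])
    run_step("Dependency security audit", ["pip-audit"])
    print("All gates passed; the build may proceed.")
```

In practice a gate like this would run as one stage of the CI system rather than as a standalone script.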
Job Title: DevOps Engineer
Job Purpose and Impact
The DevOps Engineer is a key position for strengthening security automation capabilities, which have been identified as a critical area for growth and specialization within Global IT's scope. As part of the Cyber Intelligence Operations DevOps team, you will help shape our automation efforts by building, maintaining and supporting our security infrastructure.
Key Accountabilities
- Collaborate with internal and external partners to understand and evaluate business requirements.
- Implement modern engineering practices to ensure product quality.
- Provide designs, prototypes and implementations incorporating software engineering best practices, tools and monitoring according to industry standards.
- Write well-designed, testable and efficient code using full-stack engineering capability.
- Integrate software components into a fully functional software system.
- Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
- Proficiency in at least one configuration management or orchestration tool, such as Ansible.
- Experience with cloud monitoring and logging services.
Qualifications
Minimum Qualifications
- Bachelor's degree in a related field or equivalent experience
- Knowledge of public cloud services and application programming interfaces (APIs)
- Working experience with continuous integration and delivery practices
Preferred Qualifications
- 3-5 years of relevant experience in IT, IS or software development
- Experience in:
- Code repositories such as Git
- Scripting languages such as Python and PowerShell (see the Python sketch after this list)
- Using Windows, Linux, Unix, and mobile platforms within cloud services such as AWS
- Cloud infrastructure as a service (IaaS) / platform as a service (PaaS), microservices, Docker containers, Kubernetes, Terraform, Jenkins
- Databases such as Postgres, SQL, Elastic
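As an illustration of the security automation this team describes, the following Python sketch uses boto3 to list running EC2 instances that are missing a required tag. The tag key, region and tag-based policy idea are assumptions made for the example.

```python
"""Illustrative security-automation sketch: flag running EC2 instances that are
missing a required tag. The tag key and region are assumptions for the example."""
import boto3

REQUIRED_TAG = "DataClassification"  # hypothetical tag enforced by policy


def untagged_instances(region: str = "us-east-1") -> list[str]:
    """Return IDs of running instances that lack the required tag."""
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {tag["Key"] for tag in instance.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    missing.append(instance["InstanceId"])
    return missing


if __name__ == "__main__":
    for instance_id in untagged_instances():
        print(f"Missing {REQUIRED_TAG} tag: {instance_id}")
```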
Job Title: Sr. Generative AI Engineer
Job Purpose and Impact:
The Sr. Generative AI Engineer will architect, design and develop new and existing GenAI solutions for the organization. As a Generative AI Engineer, you will develop and implement products using cutting-edge generative AI and retrieval-augmented generation (RAG) to solve complex problems and drive innovation across our organization. You will work closely with data scientists, software engineers and product managers to design, build and deploy AI-powered solutions that enhance Cargill's products and services. You will bring order to ambiguous scenarios, apply in-depth and broad knowledge of architectural, engineering and security practices to ensure your solutions are scalable, resilient and robust, and share knowledge of modern practices and technologies with the shared engineering community.
Key Accountabilities:
• Apply software and AI engineering patterns and principles to design, develop, test, integrate, maintain and troubleshoot complex and varied Generative AI software solutions and incorporate security practices in newly developed and maintained applications.
• Collaborate with cross-functional teams to define AI project requirements and objectives, ensuring alignment with overall business goals.
• Conduct research to stay up to date with the latest advancements in generative AI, machine learning and deep learning, and identify opportunities to integrate them into our products and services; optimize existing generative AI models and RAG solutions for improved performance, scalability and efficiency; and develop and maintain pipelines and RAG solutions, including data preprocessing, prompt engineering, benchmarking and fine-tuning (a minimal RAG sketch follows this list).
• Develop clear and concise documentation, including technical specifications, user guides and presentations, to communicate complex AI concepts to both technical and non-technical stakeholders.
• Participate in the engineering community by maintaining and sharing relevant technical approaches and modern skills in AI.
• Contribute to the establishment of best practices and standards for generative AI development within the organization.
• Independently handle complex issues with minimal supervision, while escalating only the most complex issues to appropriate staff.
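To make the RAG work described above concrete, here is a minimal, illustrative retrieval-augmented generation sketch in Python: it embeds a handful of documents, retrieves the passages closest to a question and assembles a grounded prompt. The embedding model (sentence-transformers) and brute-force cosine retrieval are assumptions for the example; the call to a foundation model is left as a placeholder because the posting does not name a specific platform.

```python
"""Illustrative RAG sketch: embed documents, retrieve the closest passages for a
question, and assemble a grounded prompt. The embedding model and the sample
documents are assumptions; the generation call is left as a placeholder."""
import numpy as np
from sentence_transformers import SentenceTransformer

DOCUMENTS = [
    "Order 1042 shipped from the Minneapolis plant on 2024-05-02.",
    "Soybean oil lead times are currently three weeks.",
    "The supplier scorecard is refreshed on the first Monday of each month.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
doc_vectors = model.encode(DOCUMENTS, normalize_embeddings=True)


def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question (cosine similarity)."""
    query_vector = model.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vector
    best = np.argsort(scores)[::-1][:k]
    return [DOCUMENTS[i] for i in best]


def build_prompt(question: str) -> str:
    """Assemble a prompt grounded in the retrieved context."""
    context = "\n".join(f"- {passage}" for passage in retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    prompt = build_prompt("How long are soybean oil lead times?")
    print(prompt)  # in practice, send this prompt to a foundation model API
```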
Minimum Qualifications:
• Bachelor’s degree in a related field or equivalent experience
• Minimum of five years of related work experience
• Proficiency in Python and experience with machine learning libraries and frameworks
• Deep understanding of industry-leading foundation model capabilities and their application
• Familiarity with cloud-based generative AI platforms and services
• Full-stack software engineering experience building products using foundation models
• Confirmed experience architecting applications, databases, services or integrations.
Job Title: Data Engineer
Cargill’s size and scale allows us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way. We are a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
Job Purpose and Impact
As a Data Engineer at Cargill, you will work across the full stack to design, develop and operate high-performance, data-centric solutions using our comprehensive and modern data capabilities and platforms. You will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building innovative, resilient and high-quality solutions while sharing, learning and growing together.
Key Accountabilities
· Collaborate with business stakeholders, product owners and your broader team on product or solution designs.
· Develop robust, scalable and sustainable data products or solutions utilizing cloud-based technologies.
· Provide moderately complex technical support through all phases of product or solution life cycle.
· Perform data analysis, handle data modeling, and configure and develop data pipelines to move and optimize data assets.
· Build moderately complex prototypes to test new concepts and provide ideas on reusable frameworks, components and data products or solutions and help promote adoption of new technologies.
· Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
· Other duties as assigned
Qualifications
MINIMUM QUALIFICATIONS
· Bachelor’s degree in a related field or equivalent experience
· Minimum of two years of related work experience
· Other minimum qualifications may apply
PREFERRED QUALIFICATIONS
· Experience developing modern data architectures, including data warehouses, data lakes, data meshes, hubs and associated capabilities including ingestion, governance, modeling, observability and more.
· Experience with data collection and ingestion capabilities, including AWS Glue, Kafka Connect and others.
· Experience with data storage and management of large, heterogeneous datasets, including formats, structures and cataloging with such tools as Iceberg, Parquet, Avro, ORC, S3, HDFS, Hive, Kudu or others.
· Experience with transformation and modeling tools, including SQL-based transformation, orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie and others
· Experience working in Big Data environments, including tools such as Hadoop and Spark (a short Spark sketch follows this list)
· Experience working in Cloud Platforms including AWS, GCP or Azure
· Experience with streaming and stream integration or middleware platforms, tools and architectures such as Kafka, Flink, JMS or Kinesis.
· Strong programming knowledge of SQL, Python, R, Java, Scala or equivalent
· Proficiency in engineering tooling, including Docker, Git and container orchestration services
· Strong experience working in DevOps models, with a demonstrable understanding of associated best practices for code management, continuous integration and deployment strategies.
· Experience and knowledge of data governance considerations, including quality, privacy and security implications for data product development and consumption.
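As an illustration of the Spark and pipeline experience listed above, the following minimal PySpark sketch reads a raw Parquet dataset, filters and aggregates it, and writes a curated output. The S3 paths and column names are assumptions made for the example.

```python
"""Illustrative PySpark pipeline: read raw Parquet data, aggregate it, and write
a curated dataset. Paths and column names are assumptions for the example."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("shipment-aggregates").getOrCreate()

# Hypothetical raw dataset with columns: shipment_id, region, quantity_tons, ship_date
raw = spark.read.parquet("s3://example-bucket/raw/shipments/")

curated = (
    raw.filter(F.col("quantity_tons") > 0)  # drop records with no quantity
    .withColumn("ship_month", F.date_trunc("month", F.col("ship_date")))
    .groupBy("region", "ship_month")
    .agg(F.sum("quantity_tons").alias("total_tons"))  # monthly totals per region
)

curated.write.mode("overwrite").parquet("s3://example-bucket/curated/shipment_monthly/")
```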
Equal Opportunity Employer, including Disability/Vet.
Job Title: SAP IBP Solution Architect
50% Solution Definition, Confirmation, and Development
• In-depth knowledge of SAP IBP solution development across multiple modules (DP/SP/IO), with the ability to support solution deployment in line with best practices
• Ability to drive business-need discussions, map requirements to the SAP IBP system and design the process
• Key contact for vendor partners of the IBP application
• Support end-to-end application architecture integrity and the functional relationship across multiple projects and applications
• Identifies key integration requirements to other solutions, understands the end-to-end flow of data, and ensures the integrity of data as it moves across the solution landscape
• Develops conceptual designs and solution blueprints
• Provides support and consultation for detailed design, deployment, testing and production support.
• Responsible for maintaining solution integrity through the course of the project including controlling scope, managing solutions for change requests and clarifying solution capabilities during all delivery phases
• Acts as part of the broader Cargill Architecture community to define and execute architecture processes.
• Partners with the Center-Led Portfolio Solution Architect(s) to understand the application architecture for all capabilities within the OSC portfolio (meaning the application portfolio, key APIs, transactions, etc.) and the associated roadmap
• Ensures region application architecture aligns with portfolio architecture and Enterprise Architecture standards/principles
30% Business Partnership
• Work with regional BRMs and businesses to establish clear connections between business goals and strategies and the process, data, and technology investments required to achieve them.
• Collaborate with internal teams and external stakeholders to resolve issues, troubleshoot errors, and optimize performance
• Responsible for the definition of the architecture and technology opportunities of the application based on new and emerging technologies. Independently establish priorities and strategies, ensuring they are consistent with business goals and economic viability.
20% IT Innovation & Delivery
• Drives improvement of architecture methodologies and services. Supports implementation, improvements, and proper utilization of architecture tools. Ensures the quality of architecture assets, and keeps them up-to-date. Develops and maintains architecture metrics.
• Continuous Improvement: Stay abreast of SAP IBP enhancements, upgrades, and new features. Proactively propose and implement improvements to existing processes leveraging new functionalities.
• Maintain knowledge of industry trends and utilize this knowledge to educate both IT and the business on opportunities to build better target architectures that support and drive business decisions.
• Assist in the design of new testing methods and resolve routine and non-routine technical issues with minimal assistance. With minimal guidance, monitor systems capacity and performance.
Job Title: ERP Engineering Supervisor
Job Purpose and Impact:
The Enterprise Resource Planning (ERP) Engineering Supervisor will lead a small engineering team across technology and business capabilities to build and enhance modern business applications for ERP systems in the company. In this role, you will guide the team in product development, architecture and technology adherence to ensure delivered solutions are secure and scalable. You will also lead team development, cross-team relationships and delivery to advance the company's engineering delivery.
Key Accountabilities:
- Lead a team of engineering professionals that designs, develops, deploys and enhances new and existing software solutions.
- Provide direction to the team to build highly scalable and resilient software products and platforms to support business needs.
- Provide input and guidance to the delivery team across technology and business capabilities to accomplish team deliverables.
- Provide support to software engineers dedicated to products in other portfolios within ERP teams.
- Partner with the engineering community to coach engineers, share relevant technical approaches, identify new trends, modern skills and present code methodologies.
Qualifications:
MINIMUM QUALIFICATIONS:
- Bachelor’s degree in a related field or equivalent experience
- Minimum of four years of related work experience
PREFERRED QUALIFICATIONS:
- Confirmed hands-on technical experience with technologies including cloud, software development and continuous integration/continuous delivery
- 2 years of supervisory experience
- Experience leading engineers in the areas of ERP Basis, code development (ABAP, HTML5, Python, Java, etc.) or design thinking.
Job Title: ERP Engineer
Job Purpose and Impact:
The Enterprise Resource Planning (ERP) Engineer will architect, design and develop new and existing digital solutions for the organization's ERP digital portfolio. In this role, you will discover and deliver solutions to complex and varied problems to enable the company's digital future. You will bring order to ambiguous scenarios, apply in-depth and broad knowledge of architectural, engineering and security practices to ensure your solutions are scalable, resilient and robust, and share knowledge of modern practices and technologies with the shared engineering community.
Key Accountabilities:
- Apply software engineering patterns and principles to design, develop, test, integrate, maintain and troubleshoot complex and varied software solutions and incorporate security practices in newly developed and maintained applications.
- Participate in peer code review sessions to review code, coach peers and ensure code quality.
- Take the lead in the assigned agile team to adopt agile philosophies, facilitate agile ceremonies and identify continuous improvement opportunities.
- Incorporate the company's engineering and development best practices within the full software development lifecycle including coding standards, code reviews, source control management, building processes, testing and security principles, to deliver high quality code rapidly.
- Collaborate to lead demonstration and continuous feedback sessions to improve development and help drive the long-term vision.
Qualifications:
MINIMUM QUALIFICATIONS:
- Bachelor’s degree in a related field or equivalent experience
- Confirmed experience automating pipelines for continuous integration, testing, delivery and security
- Confirmed experience architecting applications, databases, services or integrations
- Minimum of four years of related work experience
- Experience working in ERP Basis, code development (ABAP, HTML5, Python, Java, etc.) or design thinking.
- Experience working in Azure, AWS, or BTP cloud environments.
- Two positions are open: one focused on SAP ERP and one on open-source technologies.
Job Title: Kafka Administrator
As a Kafka Administrator at Cargill, you will work across the full set of data platform technologies, spanning on-premises and SaaS solutions, to power highly performant, modern, data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building, configuring and supporting platforms while sharing, learning and growing together.
- Develop and recommend improvements to standard and moderately complex application support processes and procedures.
- Review, analyze and prioritize incoming incident tickets and user requests.
- Perform programming, configuration, testing and deployment of fixes or updates for application version releases.
- Implement security processes to protect data integrity and ensure regulatory compliance.
- Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs.
MINIMUM QUALIFICATIONS
- Minimum of 2-4 years of related experience
- Knowledge of Kafka cluster management, alerting/monitoring and performance tuning (a minimal health-check sketch follows these qualifications)
- Full-ecosystem Kafka administration (Kafka, ZooKeeper, Kafka REST Proxy, Kafka Connect)
- Experience implementing Kerberos security
PREFERRED QUALIFICATIONS
- Experience in Linux system administration
- Authentication plugin experience such as basic, SSL, and Kerberos
- Production incident support including root cause analysis
- AWS EC2
- Terraform
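As an illustration of routine Kafka administration, the following minimal Python sketch uses the confluent-kafka client to connect to a cluster, list its brokers and flag under-replicated partitions. The broker address is an assumption made for the example.

```python
"""Illustrative Kafka health check: list brokers and flag under-replicated
partitions. The bootstrap server address is an assumption for the example."""
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # assumed broker address

metadata = admin.list_topics(timeout=10)
print(f"Brokers: {[broker.id for broker in metadata.brokers.values()]}")

for topic in metadata.topics.values():
    for partition in topic.partitions.values():
        # Flag partitions whose in-sync replica set is smaller than the replica set.
        if len(partition.isrs) < len(partition.replicas):
            print(f"Under-replicated: {topic.topic} partition {partition.id}")
```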
Job Title: Cloud Engineer
The Cloud Engineer will design and develop the capabilities of the company's cloud platform and the automation of application deployment pipelines to the cloud platform. In this role, you will be an essential partner and technical specialist for cloud platform development and operations.
Key Accountabilities
Participate in a dynamic development environment to engineer evolving customer solutions on Azure. Support SAP application teams with their requirements related to application and release management. Develop automation capabilities in the cloud platform to enable provisioning and upgrades of cloud services.
Design continuous integration delivery pipelines with infrastructure as code, automation, and testing capabilities to facilitate automated deployment of applications.
Develop testable code to automate cloud platform capabilities and cloud platform observability tools. Provide engineering support for implementation and proof-of-concept (POC) work on new tools and techniques.
Independently handle support of critical SAP Applications Infrastructure deployed on Azure. Other duties as assigned.
Qualifications
Minimum Qualifications
Bachelor’s degree in a related field or equivalent experience
Minimum of 5 years of related work experience
PREFERRED QUALIFICATIONS
Supporting complex application development activities in a DevOps environment.
Building and supporting fully automated cloud platform solutions as infrastructure as code (IaC). Working with cloud services platforms, primarily Azure, and automating the cloud infrastructure lifecycle with tools such as Terraform and GitHub Actions.
Experience with scripting and programming languages such as Python, Go and PowerShell (see the Python sketch at the end of this posting).
Good knowledge of applying the Azure Cloud Adoption Framework and implementing the Microsoft Well-Architected Framework.
Experience with observability tools, cloud infrastructure security services on Azure, Azure networking topologies and Azure Virtual WAN.
Experience automating Windows and Linux operating system deployments and management in auto-scaling deployments.
Good to have:
1) Managing cloud infrastructure using IaC methods (Terraform and Go)
2) Knowledge of complex enterprise networking: LAN, WAN, VNet, VLAN
3) Good understanding of application architecture: databases, tiered architecture
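As an illustration of the Python-based cloud automation referenced above, the following minimal sketch uses the Azure SDK to authenticate with DefaultAzureCredential and list resource groups with their tags, the kind of building block an observability or compliance script might start from. The environment variable used for the subscription ID is an assumption made for the example.

```python
"""Illustrative Azure automation sketch: authenticate and list resource groups
with their tags. The subscription ID environment variable is an assumption."""
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()
subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]  # assumed environment variable

client = ResourceManagementClient(credential, subscription_id)
for group in client.resource_groups.list():
    print(f"{group.name} ({group.location}) tags={group.tags or {}}")
```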