
Job Title: Data Engineer
Cargill’s size and scale allows us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way. We are a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
Job Purpose and Impact
As a Data Engineer at Cargill you work across the full stack to design, develop and operate high performance and data centric solutions using our comprehensive and modern data capabilities and platforms. You will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building innovative, resilient, and high-quality solutions while sharing, learning and growing together.
Key Accountabilities
· Collaborate with business stakeholders, product owners and your team on product or solution designs.
· Develop robust, scalable and sustainable data products or solutions utilizing cloud-based technologies.
· Provide moderately complex technical support through all phases of product or solution life cycle.
· Perform data analysis, handle data modeling, and configure and develop data pipelines to move and optimize data assets.
· Build moderately complex prototypes to test new concepts and provide ideas on reusable frameworks, components and data products or solutions and help promote adoption of new technologies.
· Independently solve moderately complex issues with minimal supervision, while escalating more complex issues to appropriate staff.
· Other duties as assigned
Qualifications
MINIMUM QUALIFICATIONS
· Bachelor’s degree in a related field or equivalent experience
· Minimum of two years of related work experience
· Other minimum qualifications may apply
PREFERRED QUALIFICATIONS
· Experience developing modern data architectures, including data warehouses, data lakes, data meshes, hubs and associated capabilities including ingestion, governance, modeling, observability and more.
· Experience with data collection and ingestion capabilities, including AWS Glue, Kafka Connect and others.
· Experience with data storage and management of large, heterogeneous datasets, including formats, structures, and cataloging, with tools such as Iceberg, Parquet, Avro, ORC, S3, HDFS, Hive, Kudu or others.
· Experience with transformation and modeling tools, including SQL-based transformation, orchestration and quality frameworks such as dbt, Apache NiFi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie and others.
· Experience working in Big Data environments including tools such as Hadoop and Spark.
· Experience working in Cloud Platforms including AWS, GCP or Azure.
· Experience with streaming and stream integration or middleware platforms, tools, and architectures such as Kafka, Flink, JMS, or Kinesis.
· Strong programming knowledge of SQL, Python, R, Java, Scala or equivalent
· Proficiency in engineering tooling including Docker, Git, and container orchestration services.
· Strong experience working in DevOps models with a demonstrable understanding of associated best practices for code management, continuous integration, and deployment strategies.
· Experience and knowledge of data governance considerations, including quality, privacy, and security, and their implications for data product development and consumption.
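For candidates unfamiliar with the quality frameworks named in the list above (dbt tests, Great Expectations), the underlying idea can be sketched in plain Python. This is an illustrative sketch only; the function and field names are hypothetical and not part of any specific stack:

```python
# Minimal sketch of declarative data-quality checks, in the spirit of
# dbt tests / Great Expectations. All names here are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_in_range(rows, column, lo, hi):
    """Return indices of rows where `column` is present but outside [lo, hi]."""
    return [
        i for i, row in enumerate(rows)
        if row.get(column) is not None and not (lo <= row[column] <= hi)
    ]

if __name__ == "__main__":
    shipments = [
        {"id": 1, "weight_kg": 120.0},
        {"id": 2, "weight_kg": None},   # fails the not-null check
        {"id": 3, "weight_kg": -5.0},   # fails the range check
    ]
    print(check_not_null(shipments, "weight_kg"))          # [1]
    print(check_in_range(shipments, "weight_kg", 0, 1000)) # [2]
```

Frameworks like Great Expectations express the same assertions declaratively and attach them to pipeline runs, rather than as ad hoc functions.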
Equal Opportunity Employer, including Disability/Vet.

About the Role
The Operation Executive is responsible for planning, coordinating, and executing various events, BTL activities, vendor visits, branding material procurement, and related work, ensuring quality execution. This role requires a smart and organized individual who can manage multiple tasks simultaneously, work under pressure, and deliver exceptional execution as per SOP.
Key Responsibilities
Event Planning and Coordination
Collaborate with BD and CS teams to understand their event goals, themes, and requirements.
Develop detailed plans, timelines, checklists, and production schedules.
Coordinate with vendors, suppliers, and contractors to secure necessary services and materials.
Conduct site visits and assessments to ensure venue suitability and alignment with client expectations.
Logistics Management
Oversee the setup, execution, and teardown of events and activities, ensuring all elements are in place and functioning correctly as per SOP.
Manage logistics, including transportation, accommodation, and catering arrangements.
Ensure compliance with health, safety, and regulatory standards during all activities.
Technical Coordination
Work closely with technical teams to ensure proper setup and operation of audio-visual equipment, lighting, staging, and other technical aspects.
Troubleshoot technical issues during events/activities and implement quick solutions to minimize disruptions.
Budget Management
Manage event/activity and procurement budgets, ensuring all expenses are tracked and kept within allocated limits.
Negotiate with vendors and suppliers to secure competitive rates and optimize costs.
Team Collaboration
Lead and coordinate event/activity staff, including volunteers, to ensure smooth execution of event tasks.
Foster a positive and collaborative team environment.
Client and Stakeholder Relations
Serve as the on-ground point of contact for clients, providing regular updates and addressing concerns.
Maintain strong relationships to ensure client satisfaction and seamless delivery of requirements.
Professionally handle client feedback and implement improvements as needed.
Post-Event Evaluation
Conduct post-event evaluations to assess event success and identify areas for improvement.
Prepare detailed event reports for clients and internal stakeholders.
What We’re Looking For
1-3 years of experience in event operations, logistics management, or BTL activity execution.
Strong organizational and multitasking skills to manage multiple projects simultaneously.
Ability to work in a fast-paced, high-pressure, real-time environment.
Excellent communication and vendor management skills.
Proficiency in Microsoft Office Suite (Excel, Word, and PowerPoint).
Problem-solving mindset with the ability to resolve issues efficiently on the ground.
Self-motivated, detail-oriented, and capable of working independently with minimal supervision.
Must possess a two-wheeler and have a valid driver’s license for local travel.

About Us
MatchMove is a leading embedded finance platform that empowers businesses to embed financial services into their applications. We provide innovative solutions across payments, banking-as-a-service, and spend/send management, enabling our clients to drive growth and enhance customer experiences.
Are You The One?
As a Technical Lead Engineer - Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business.
You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight.
You will contribute to
- Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services.
- Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark.
- Structuring and evolving data into open table formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services.
- Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases.
- Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment.
- Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM).
- Using Generative AI tools to enhance developer productivity — including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights.
- Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines.
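As an illustration of the SLA/SLO tracking mentioned above, a freshness check can be sketched in plain Python. The dataset names and thresholds below are hypothetical; in practice this logic would live behind orchestration and observability tooling (Airflow sensors, alerting) rather than a standalone script:

```python
# Hypothetical sketch of an SLA freshness check: flag datasets whose last
# successful load is older than the agreed SLA window.
from datetime import datetime, timedelta

def breached_slas(last_success, sla_minutes, now):
    """Return names of datasets whose last successful load exceeds the SLA.

    last_success: dict of dataset name -> datetime of last successful run
    sla_minutes:  dict of dataset name -> allowed staleness in minutes
    """
    return [
        name for name, ts in last_success.items()
        if now - ts > timedelta(minutes=sla_minutes[name])
    ]

now = datetime(2024, 1, 1, 12, 0)
last_success = {
    "fraud_events": datetime(2024, 1, 1, 11, 50),   # 10 minutes old, SLA 15
    "reconciliation": datetime(2024, 1, 1, 9, 0),   # 3 hours old, SLA 60
}
sla_minutes = {"fraud_events": 15, "reconciliation": 60}
print(breached_slas(last_success, sla_minutes, now))  # ['reconciliation']
```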
Responsibilities
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3, and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.
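As a small illustration of the partitioning strategies mentioned above, the Hive-style key layout that Glue-cataloged tables in S3 commonly rely on can be sketched in plain Python. The bucket, table, and column names are hypothetical:

```python
# Illustrative sketch of a Hive-style partition layout for objects in S3,
# the kind of prefix scheme Glue crawlers and Athena can prune on.
from datetime import date

def partition_key(table_root: str, event_date: date, region: str) -> str:
    """Build a deterministic S3 prefix: root/year=YYYY/month=MM/day=DD/region=R/."""
    return (
        f"{table_root}/year={event_date.year:04d}"
        f"/month={event_date.month:02d}/day={event_date.day:02d}"
        f"/region={region}/"
    )

print(partition_key("s3://lake/transactions", date(2024, 3, 7), "sg"))
# s3://lake/transactions/year=2024/month=03/day=07/region=sg/
```

Low-cardinality, query-aligned partition columns like these keep scan costs down; table formats such as Iceberg additionally hide the layout behind declared partition transforms.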
Requirements
- At least 6 years of experience in data engineering.
- Deep hands-on experience with AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains — with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.
Brownie Points
- Experience working in a PCI DSS or other regulated environment (e.g., under central bank oversight) with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.
MatchMove Culture:
- We cultivate a dynamic and innovative culture that fuels growth, creativity, and collaboration. Our fast-paced fintech environment thrives on adaptability, agility, and open communication.
- We focus on employee development, supporting continuous learning and growth through training programs, learning on the job and mentorship.
- We encourage speaking up, sharing ideas, and taking ownership. Embracing diversity, our team spans across Asia, fostering a rich exchange of perspectives and experiences.
- Together, we harness the power of fintech and e-commerce to make a meaningful impact on people's lives.
Grow with us and shape the future of fintech and e-commerce. Join us and be part of something bigger!
We are seeking an experienced and dynamic Corporate Sales Professional with strong expertise in new business development and client acquisition within the insurance broking industry. The ideal candidate should have a proven track record in driving corporate sales, building client relationships, and delivering revenue growth.
Key Responsibilities
- Drive new business acquisition by identifying, approaching, and converting potential corporate clients.
- Develop and execute effective sales strategies for corporate insurance solutions.
- Build and maintain long-term relationships with clients, ensuring superior service delivery.
- Collaborate with internal teams to design tailored insurance solutions for corporate clients.
- Meet and exceed monthly/quarterly/annual sales and revenue targets.
- Monitor market trends, competitor activities, and emerging opportunities in the corporate insurance space.
- Prepare and present proposals, negotiate, and close deals with CXOs and decision-makers.
- Ensure compliance with regulatory guidelines and company policies.
Key Requirements
- 7+ years of experience in corporate sales within the insurance broking industry.
- If from an insurance company, must have direct sales (corporate clients) exposure.
- Strong network and established relationships with corporate clients.
- Demonstrated success in new business development & client acquisition.
- Excellent communication, negotiation, and presentation skills.
- Ability to work independently with a target-driven approach.
- Strong understanding of corporate insurance products and solutions.
What We Offer
- Competitive salary with performance-linked incentives.
- Opportunity to work with a reputed and growing insurance broking firm.
- Exposure to diverse industries and corporate clients.
- Career growth and professional development opportunities.


You will be part of the core engineering team developing AI/ML models, algorithms, and frameworks in the areas of Video Analytics, Business Intelligence, and IoT Predictive Analytics.
For more information, visit www.gyrus.ai
Candidates must have the following qualifications:
- Engineering or Master's degree in CS, EC, EE or related domains
- Proficiency in OpenCV
- Proficiency in Python programming
- Exposure to one of the AI platforms such as TensorFlow, Caffe, or PyTorch
- Must have trained and deployed at least one fairly large AI model
- Exposure to AI models for Audio/Image/Video Analytics
- Exposure to one of the Cloud Computing platforms (AWS/GCP)
- Strong mathematical background with emphasis on Linear Algebra and Statistics


We are looking for a UI Developer to join our team! Headquartered in New York, LodgIQ delivers a revolutionary SaaS platform for Algorithmic Pricing and Revenue Management for the hospitality industry by incorporating machine learning and artificial intelligence. For more information, visit http://www.lodgiq.com
Backed by Highgate Ventures and Trilantic Capital Partners, LodgIQ is a well-funded company seeking a motivated and entrepreneurial UI Developer to join its Product/Engineering team in India. Qualified A+ candidates will be offered an excellent compensation and benefits package.
Requirements:
- In-depth knowledge of HTML/JavaScript/React.js
- Strong flair for UI and UX
- Understanding of Web Application architectures, e.g., Django/Flask
- Familiarity with Cloud Environments (AWS, EC2, S3, IAM)
- Working knowledge of NoSQL databases such as MongoDB
- Proficiency in consuming and developing REST APIs with JSON data
Specific Job Knowledge, Skills & Abilities:
- Real world experience with large-scale data on AWS or similar platform
- Must be a self-starter and an effective data wrangler
- Intellectual curiosity and strong desire to learn new Big Data and Machine Learning technologies
- Deadline driven, and capable of delivering projects on time under a fast paced, high growth environment
- Willingness to work with unstructured and messy data
- Bachelor's or Master's degree in a relevant quantitative field.
Requirements:
- 4+ years of experience in developing database applications and reports
- Expert in SQL in various databases with good exposure to MySQL
- Hands on experience in query tuning for applications, APIs and reports
- Expertise in developing database procedures, triggers and events.
- Experience in AWS Aurora.
- Excellent communication skills.
Job Profile: Software Development Engineer IV - iOS - StoreFront team
Location: Bangalore | Karnataka
ABOUT THE TEAM & ROLE:
Swiggy's StoreFront Engineering team helps customers enjoy personalized discovery and purchase experience across multiple product lines (Stores, Food, Genie and Instamart). The team is enabling this by developing thoughtfully crafted applications, smart cataloging, relevance-based search & intent-driven merchandising, checkout management solutions, and payment systems.
We are looking for engineers who have hands-on experience in building highly reliable distributed systems and deep expertise in database design & performance tuning. Knowledge of Machine Learning and other predictive modeling techniques will be an added strength. A few interesting problems we are solving include:
1. Client-facing Applications
2. Smart Catalog & Category Intelligence
3. Personalized Search & Merchandising experience
4. Payments
5. Pricing
6. Order Management System
At Swiggy, SDE IVs play an integral role in owning the end-to-end design/architecture of complex systems. They co-own the technology vision of their team and significantly contribute to its overall success. They partner with product/business teams to understand product features and specifications, and translate them into high-level and low-level designs, thereby facilitating the team in the design and development of mission-critical applications.
What qualities are we looking for?
- Hands-on experience in mobile application development for at least
- Hands-on working experience in Swift
- Experience in multithreaded programming and memory optimization
- Ability to learn and grow in a fast-paced setup
- Working knowledge of iOS Architectural Components and Design Patterns
- Very good debugging skills
- Good knowledge of implementing pixel-perfect designs
- Good grasp of Data Structures and Algorithms
What will you get to do here?
- Coming up with best practices to help the team achieve their technical tasks and continually thrive in improving the technology of the product/team
- Driving the adoption of best practices & regular participation in code reviews, design reviews, and architecture discussions
- Experiment with new & relevant technologies and tools, and drive adoption while measuring yourself on the impact you can create
- Implementation of long term technology vision for your team
- Responsible for complete architecture of your product
- Creating architectures & designs for new solutions around existing/new areas
- Decide technology & tool choices for your team & be responsible for them
Responsibilities:
- 4+ years of work experience
- Build intuition of the product as a whole
- Gather requirements from stakeholders, derive problem statements and plan design engagements. Also, proactively launch efforts to improve different aspects of the product
- Conduct research to understand real-world practices and usage behaviours, employing suitable methods and tools
- Analyse findings, derive meaningful and actionable insights from research
- Create functions, features and formulate optimised task flows to solve business problems and cater to user needs in collaboration with other stakeholders
- Interpret the solution on to an interface using the principles of digital psychology, visual design and interaction design for web or/and mobile
- Present design propositions and solutions to stakeholders using wireframes, mock-ups or high fidelity prototypes, receive feedback, advocate for best practices and iterate based on feedback
- Decipher usage patterns with the help of analytics and iterate designs to be more efficient
- Cultivate efficient frameworks and practices for the enrichment of the team and contribute to the improvement of the design system
Mandatory Skills
- Experienced and well versed in conducting different methods of research and usability testing
- Good knowledge and understanding of design principles, interaction design standards and UI patterns for web and mobile
- In depth understanding of the elements of a design system
- Great aptitude for solving complex business problems
Good to have skills
- Design specialisations - Illustration, motion design, graphic design, data visualisation, service design
- Domain specialisations - E-commerce & B2B
- Cross-functional skills - Technological
Candidates with published/written case studies will be preferred





