XpressBees, a logistics company started in 2015, is among the fastest-growing companies in its sector. Our
vision of evolving into a strong full-service logistics organization is reflected in our various lines of business: B2C
logistics, 3PL, B2B Xpress, hyperlocal, and cross-border logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve into India's most trusted
logistics partner. XB has progressively carved its way towards best-in-class technology platforms, an
extensive logistics network, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our
big focus areas for the near future include strengthening our presence as the service provider of choice and
leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees is enriching and scaling its end-to-end logistics solutions at a rapid pace. This is a great opportunity to join
the team shaping and delivering the operational strategy behind Artificial Intelligence / Machine Learning
and Data Engineering, leading projects and teams of AI Engineers in collaboration with Data Scientists. In this role, you
will build high-performance AI/ML solutions using groundbreaking AI/ML and Big Data technologies. You will need to
understand business requirements and convert them into solvable data science problem statements. You will be
involved in end-to-end AI/ML projects, from small-scale POCs all the way to full-scale ML pipelines in
production.
Seasoned AI/ML Engineers will own the implementation and productionization of cutting-edge AI-driven algorithmic
components for search, recommendation, and insights to improve the efficiency of the logistics supply chain and
serve the customer better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact on
the organization while solving challenging problems in the areas of AI, ML, Data Analytics, and Computer Science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly Detection, Computer Vision (e.g., loading / unloading); see the sketch after this list
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine, etc.
- Customer & tech support solutions, e.g., chatbots
- Breach detection / prediction
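As a flavour of the anomaly detection opportunity above, here is a minimal, illustrative sketch (not the actual XpressBees approach) that flags unusual shipment records with scikit-learn's IsolationForest; the feature names and the synthetic data are hypothetical.

```python
# Illustrative only: hypothetical shipment features for unsupervised anomaly detection.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical features per shipment: transit hours, scan count, weight deviation (kg).
normal = rng.normal(loc=[24.0, 6.0, 0.0], scale=[4.0, 1.5, 0.2], size=(1000, 3))
odd = rng.normal(loc=[72.0, 1.0, 3.0], scale=[5.0, 0.5, 0.5], size=(10, 3))
X = np.vstack([normal, odd])

# contamination is a guess at the anomaly rate; tune it against labelled incidents.
model = IsolationForest(n_estimators=200, contamination=0.01, random_state=42)
model.fit(X)

labels = model.predict(X)          # -1 = anomaly, 1 = normal
scores = model.score_samples(X)    # lower score = more anomalous
print(f"flagged {int((labels == -1).sum())} of {len(X)} shipments as anomalous")
```

In practice the flagged records would feed a review queue or downstream fraud / breach workflows rather than being acted on automatically.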
An Artificial Intelligence Engineer will apply themselves in areas such as:
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning: Logistic Regression, Decision Trees, Random Forests, XGBoost, etc.
- Driving optimization via LPs, MILPs, Stochastic Programs, and MDPs (a minimal MILP sketch follows this list)
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
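To ground the LP/MILP item above, the following is a minimal, hypothetical sketch of assigning parcels to vehicles as a small MILP using PuLP (assuming its bundled CBC solver is available); the costs and capacities are invented for illustration.

```python
# Illustrative MILP: assign parcels to vehicles at minimum cost (PuLP + CBC assumed).
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpStatus, value

parcels = ["p1", "p2", "p3", "p4"]
vehicles = ["v1", "v2"]
capacity = {"v1": 3, "v2": 2}        # max parcels per vehicle (hypothetical)
cost = {                             # delivery cost per (parcel, vehicle), hypothetical
    ("p1", "v1"): 4, ("p1", "v2"): 6,
    ("p2", "v1"): 5, ("p2", "v2"): 3,
    ("p3", "v1"): 2, ("p3", "v2"): 7,
    ("p4", "v1"): 6, ("p4", "v2"): 4,
}

prob = LpProblem("parcel_assignment", LpMinimize)
x = {(p, v): LpVariable(f"x_{p}_{v}", cat="Binary") for p in parcels for v in vehicles}

# Objective: total assignment cost.
prob += lpSum(cost[p, v] * x[p, v] for p in parcels for v in vehicles)

# Each parcel goes on exactly one vehicle.
for p in parcels:
    prob += lpSum(x[p, v] for v in vehicles) == 1

# Vehicle capacity limits.
for v in vehicles:
    prob += lpSum(x[p, v] for p in parcels) <= capacity[v]

prob.solve()
print(LpStatus[prob.status], "total cost =", value(prob.objective))
for (p, v), var in x.items():
    if var.value() == 1:
        print(p, "->", v)
```

Real route optimization adds many more constraints (time windows, distances, driver shifts), but the modelling pattern stays the same.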
The AI Engineering team enables internal teams to add AI capabilities to their apps and workflows easily via APIs,
without each team needing to build its own AI expertise, spanning Decision Support, NLP/NLU, Computer Vision, and
Conversational AI for public cloud and enterprise deployments. The candidate is adept at working with large data sets
to find opportunities for product and process optimization, and at using models to test the effectiveness of different
courses of action. They must have experience with a variety of data mining and data analysis methods and data tools,
with building and implementing models, with using and creating algorithms, and with creating and running simulations.
They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will
have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve
business outcomes.
Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backend services, that automates the training and
deployment of ML models (a minimal serving sketch follows this list).
● Build cloud services in Decision Support (anomaly detection, time series forecasting, fraud detection,
risk prevention, predictive analytics), computer vision, natural language processing (NLP), and speech that
work out of the box.
● Brainstorm and design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists and software engineers to build out other parts of the infrastructure, effectively
communicating your needs, understanding theirs, and addressing external and internal stakeholders'
product challenges.
● Build the core of AI services such as Decision Support, Vision, Speech, Text, NLP, NLU,
and others.
● Leverage cloud technologies: AWS, GCP, Azure.
● Experiment with ML models in Python using machine learning libraries (PyTorch, TensorFlow) and Big Data
technologies such as Hadoop, HBase, Spark, etc.
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to
drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product
development, marketing techniques and business strategies.
● Assess the effectiveness and accuracy of new data sources and data gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to enhance and optimize customer experience, supply chain metrics, and other
business outcomes.
● Develop the company's A/B testing framework and test model quality.
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects using data science techniques and associated libraries
such as AI/ML or equivalent NLP (Natural Language Processing) packages. This requires a strong
understanding of statistical models, probabilistic algorithms, classification, clustering, deep
learning, and related approaches as they apply to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets and architectural patterns for
successful delivery.
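As a flavour of the microservices responsibility above, here is a minimal, hypothetical sketch of a model-serving microservice built with FastAPI and scikit-learn; the endpoint, feature schema, and toy model are illustrative only, not XpressBees' actual services.

```python
# Illustrative only: a tiny prediction microservice (FastAPI + scikit-learn).
# Endpoint name and feature schema are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Train a toy model at startup; in practice the model would be loaded from a registry.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

app = FastAPI(title="demo-ml-service")

class Features(BaseModel):
    f1: float
    f2: float
    f3: float
    f4: float

@app.post("/predict")
def predict(features: Features) -> dict:
    row = [[features.f1, features.f2, features.f3, features.f4]]
    proba = float(model.predict_proba(row)[0][1])
    return {"score": proba, "label": int(proba >= 0.5)}

# Run locally with: uvicorn service:app --reload  (assuming this file is service.py)
```

In a production setup the training step would live in its own automated pipeline, and the service would only load and serve a versioned model artifact.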
What is required of you?
You will get an opportunity to build and operate a suite of massive scale, integrated data/ML platforms in a broadly
distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science or Computer Engineering
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience with building high-performance, resilient, scalable, and well-engineered systems
● Experience in CI/CD and development best practices, instrumentation, logging systems
● Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights
from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as
classification, information retrieval, clustering, knowledge graph, semi-supervised learning and ranking.
● Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest,
Boosting, Trees, text mining, social network analysis, etc.
● Knowledge of web services: Redshift, S3, Spark, DigitalOcean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression,
simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from third-party providers: Google Analytics, SiteCatalyst, Coremetrics,
AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: MapReduce, Hadoop, Hive, Spark, MySQL, Kafka, etc.
● Knowledge of visualizing/presenting data for stakeholders using QuickSight, Periscope, Business Objects,
D3, ggplot, Tableau, etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests, and proper usage, etc.) and experience with applications.
● Experience building data pipelines that prep data for Machine learning and complete feedback loops.
● Knowledge of Machine Learning lifecycle and experience working with data scientists
● Experience with Relational databases and NoSQL databases
● Experience with workflow scheduling / orchestration tools such as Airflow or Oozie (a minimal Airflow DAG sketch follows this list)
● Working knowledge of current techniques and approaches in machine learning and statistical or
mathematical models
● Strong Data Engineering & ETL skills to build scalable data pipelines. Exposure to data streaming stack (e.g.
Kafka)
● Relevant experience in fine-tuning and optimizing ML (especially Deep Learning) models to bring down
serving latency.
● Exposure to the ML model productionization stack (e.g. MLflow, Docker)
● Excellent exploratory data analysis skills to slice & dice data at scale using SQL in Redshift/BigQuery.
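To illustrate the workflow orchestration requirement above, here is a minimal, hypothetical Airflow 2.x DAG that chains a daily feature extraction step and a model training step; the DAG id, tasks, and callables are placeholders rather than an actual XpressBees pipeline.

```python
# Illustrative Airflow 2.x DAG: daily feature extraction followed by model training.
# Task names and callables are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_features():
    # Placeholder: pull raw shipment data and write features to a table or object store.
    print("extracting features")


def train_model():
    # Placeholder: train and register a model on the latest features.
    print("training model")


with DAG(
    dag_id="demo_ml_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
    train = PythonOperator(task_id="train_model", python_callable=train_model)

    extract >> train  # training runs only after extraction succeeds
```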
