

XpressBees, a logistics company started in 2015, is among the fastest-growing
companies in its sector. While we started off rather humbly in B2C ecommerce
logistics, the last five years have seen us steadily expand our presence. Our
vision of evolving into a strong full-service logistics organization is reflected
in our new lines of business such as 3PL, B2B Xpress and cross-border operations.
Our strong domain expertise and constant focus on meaningful innovation have
helped us rapidly evolve into India's most trusted logistics partner. We have
progressively built best-in-class technology platforms, an extensive network
reach, and a seamless last-mile management system. While on this aggressive
growth path, we seek to become the one-stop shop for end-to-end logistics
solutions. Our big focus areas for the near future include strengthening our
position as the service provider of choice and leveraging the power of
technology to improve efficiencies for our clients.
Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform
and infrastructure that supports high-quality, agile decision-making in our supply chain and logistics
workflows.
You will define how we collect and operationalize data (structured and unstructured), and
build production pipelines for our machine learning models and for real-time, near-real-time and batch
reporting and dashboarding requirements. You will use your experience with modern cloud and data
frameworks to build products (with their storage and serving systems) that drive optimisation and
resilience in the supply chain through data visibility, intelligent decision-making, insights,
anomaly detection and prediction.
What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and
machine learning models. These pipelines will productionize machine learning models
and integrate with agent review tools.
• Meet data completeness, correctness and freshness requirements.
• Evaluate and select data store and data streaming technologies.
• Lead the design of the logical model and implement the physical model to support
business needs. Produce logical and physical database designs across platforms (MPP,
MapReduce, Hive/Pig) that are optimal for different use cases (structured/semi-structured
data). Envision and implement the data modelling, physical design and performance
optimization techniques required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their
successors.
Qualifications & Experience relevant for the role
• A bachelor's degree in Computer Science or a related field, with 6 to 9 years of technology
experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching,
sufficient to make sound technology and design choices.
• Strong experience in system integration, application development, ETL and data-platform
projects; talented across technologies used in the enterprise space.
• Software development experience, including:
• Expertise in relational and dimensional modelling
• Exposure to all phases of the SDLC
• Experience in cloud architecture (AWS)
• A proven track record of keeping existing technical skills sharp and developing new ones, so
that you can make strong contributions to deep architecture discussions around systems and
applications in the cloud (AWS).
• The characteristics of a forward thinker and self-starter who flourishes with new challenges
and adapts quickly to new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple
projects.
• A knack for helping an organization understand application architectures and integration
approaches, architect advanced cloud-based solutions, and launch the build-out
of those systems.
• Passion for educating, training, designing, and building end-to-end systems.

Similar jobs
Responsibilities:
- Work experience in IT operations in mid-size to enterprise environments.
- Must have a strong understanding of industry trends and customized Drupal development (content management system).
- Hands-on experience working with Drupal, PHP/MySQL, JavaScript, jQuery, and JSON/XML formats.
- Good experience in developing custom modules in Drupal and integrating them with the system.
- Must have good experience as a backend Drupal developer, with a good understanding of consuming data from the backend layer and passing it to the frontend.
- Responsible for designing and implementing functionality and turning it into a working feature.
- Good understanding of site development.
- Ensuring communication between the frontend and backend layers.
- Responsible for high performance and availability of the system, and for managing all technical aspects of the Drupal CMS.
- Work closely with front-end developers and customers to ensure an effective, visually appealing, and intuitive implementation.
- Good communication skills.

We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python (a minimal orchestration sketch follows this list).
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
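The sketch below, referenced in the first responsibility above, is a minimal Cloud Composer DAG, assuming Airflow 2.x with the Google provider package installed; the bucket, project, dataset and table names are hypothetical placeholders. It loads CSV files from GCS into BigQuery and then runs a lightweight validation hook of the kind described in the data-quality item.

```python
# A minimal sketch, assuming Airflow 2.x on Cloud Composer with the Google
# provider installed. All bucket, project, dataset and table names below are
# hypothetical placeholders, not real resources.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)


def validate_load(**context):
    # Placeholder data-quality hook: a real pipeline might query BigQuery for
    # row counts or freshness and raise an exception to fail the task.
    print("running post-load validation (placeholder)")


with DAG(
    dag_id="gcs_to_bigquery_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load newly landed CSV files from GCS into a BigQuery table.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-landing-bucket",
        source_objects=["orders/*.csv"],
        destination_project_dataset_table="example-project.analytics.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )

    # Run a lightweight validation step after the load completes.
    check_orders = PythonOperator(
        task_id="check_orders",
        python_callable=validate_load,
    )

    load_orders >> check_orders
```

In practice the validation task would enforce the data-quality checks and monitoring described above (row counts, freshness, schema drift) and fail the run on a breach; Dataflow or Cloud Run tasks can be chained into the same DAG for heavier transformations.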
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
Responsibilities:
- Lead the design and implementation of Pega Decisioning solutions, ensuring alignment with business objectives and industry best practices.
- Collaborate with cross-functional teams to identify business requirements and develop decisioning strategies that drive business outcomes.
- Develop and maintain Pega Decisioning architectures, ensuring scalability, performance, and reliability.
- Provide technical leadership and guidance to junior team members, ensuring adherence to Pega best practices and standards.
- Work closely with stakeholders to communicate solution designs, plans, and progress, ensuring transparency and alignment.
- Troubleshoot and resolve complex technical issues, providing timely and effective solutions.
Requirements:
- Minimum 8 to 10 years of experience in Pega Decisioning, with a strong background in designing and implementing Pega Decisioning solutions.
- Pega Lead Decision Architect (LDA) certification required.
- Proven experience in leading complex Pega projects, with a strong understanding of Pega architecture and decisioning capabilities.
- Excellent problem-solving skills, with the ability to analyze complex business problems and develop effective solutions.
- Strong communication and interpersonal skills, with the ability to work effectively with stakeholders and team members.
- Bachelor's degree in Computer Science, Engineering, or a related field.
Good to Have Skills:
- Experience with Pega's Next-Generation Decisioning capabilities.
- Knowledge of industry-specific regulations and standards (e.g., GDPR, CCPA).
- Experience with Agile methodologies and DevOps practices.
- Strong understanding of data science and analytics concepts, with experience in integrating Pega Decisioning with external data sources.
- Certification in related Pega products (e.g., Pega Platform, Pega Customer Decision Hub).
Contacting current and potential clients to inform them about the company's products and services.
Answering all queries and questions regarding the company and product.
Understanding customers’ requirements by asking questions and closing the deal.
Keeping the customer database maintained and updated.
Facilitating the company's sales by going the extra mile and meeting the sales quota.
Keeping a record of all the sales calls and notes of useful information.
Requirements
1+ years of experience as a telecaller or in a similar role.
Proven track record of successfully achieving the sales quota.
Knowledge of computer programs like CRM and telephone systems.
Excellent interpersonal and communication, research, and record-keeping skills.
Excellent negotiation skills and the ability to resolve issues.
Candidates with knowledge of BFSI and experience in BFSI and BPO will be given preference.
Skills Required:
Must have strong communication skills
Prior experience working as a telecaller or in similar roles
Proven or sound ability in telesales
Should be aware of CRM tools
Should have sound knowledge of effective communication in English
Role: Telecaller
Industry Type: BFSI
Employment Type: Full Time
Day Shift: 9:30 am to 6:00 pm
Salary: 10K to 20K
We are seeking a talented and creative Motion Graphic and Video Editor with expertise in video editing, motion graphics, and preferably VFX.
The ideal candidate should have a Bachelor’s degree in Motion Graphics or a related field and a strong passion
for creating engaging and visually captivating content.
Key Responsibilities:
Edit and assemble raw video footage into polished final products for social media and campaigns.
Design and create motion graphics for promotional, educational, or commercial projects.
Collaborate with the creative team to develop storyboards, visual concepts, and design elements.
Incorporate animations, transitions, and effects to enhance video presentations.
(Optional) Utilize VFX skills to add special effects and elevate the visual quality of projects.
Manage multiple projects simultaneously while meeting deadlines.
Ensure all content aligns with the brand’s vision and objectives.
Qualifications:
Bachelor’s degree in Motion Graphics, Animation, or a related field.
Proficiency in video editing software (e.g., Adobe Premiere, Final Cut Pro).
Advanced knowledge of motion graphics tools (e.g., After Effects, Cinema 4D).
Strong understanding of VFX techniques (preferred but not required).
Excellent storytelling skills and creative problem-solving abilities.
Attention to detail and ability to work both independently and within a team.
• Recognizing the manpower needs and taking them to the HR Manager for approval.
• Creating job descriptions for various roles and advertising them on various platforms.
• Managing HR activities like meetings, interviews, and other schedules.
• Managing and handling the orientation of new employees.
• Assessing training needs and coordinating training and development programs for
employees with HR Heads.
• Providing support to employees on various HR-related topics such as leaves and
compensation, and resolving issues and problems.
• Processing, verifying and maintaining documentation relating to HR activities such as
staffing, training and performance evaluations.
• Conducting employee onboarding and helping plan training & development.
• Maintaining employee files and records in electronic and paper form.
• Overseeing the daily operations of the HR department.
• Conducting exit interviews and handling full & final settlements.
• Managing payroll, time office and attendance.
Requirements
• Must have proven experience working as an HR executive or in an equivalent role.
• Experienced with the full recruitment cycle.
• Good knowledge of employment/labor laws
• Outstanding knowledge of MS Office
• Excellent verbal and written communication skills
• Excellent communication, interpersonal and collaboration skills
• Strong analytical and problem-solving skills
• Ability to prioritize and multi-task
• High ethical conduct
• BSc/BA in Business administration or relevant field
• Additional HR training is an advantage


Envoy combines technology and global immigration services to offer the only immigration management platform that makes it seamless for companies to hire and manage an international workforce. We empower companies to acquire the best talent regardless of where they are in the world; help mobilize employees around the world to take advantage of business opportunities; and enable the management of entire global workforces, providing a strategic, proactive view into workforce and financial forecasting and compliance. We are a fast-growing, award-winning technology company, a leader in our space, and are backed by some of the country's leading venture capital and growth equity firms.
Our Engineering Team
We have a passionate product engineering team that works on complex technical challenges, employs creativity, and constantly learns a variety of frameworks, tools, and technologies. While we are dedicated to delivering an excellent product, we also believe that having fun along the way motivates us to pour our hearts into what we do. Our team has mastered the development and delivery process, allowing it to focus on designing, architecting, and crafting masterpieces.
We are growing rapidly and expanding our team in India. Join our product engineering team to be part of this exciting journey.
“Code is like humor. When you have to explain it, it’s bad.” – Cory House
Skills Required
- 2+ years of strong programming experience on .NET platform
- Experience with C#, ASP.NET Web API
- Experience with Angular or any front-end framework or passion to learn Angular
- Hands-on experience with SQL
- Knowledge and experience with HTML, CSS, JavaScript
Expectations
- Quality is the key driver of successful delivery; ensure highly testable, high-quality deliverables
- Passion to learn and acquire new skills
Envoy Global is an equal opportunity employer and will recruit, hire, train and promote into all job levels the most qualified applicants without regard to race, color, religion, sex, national origin, age, disability, ancestry, sexual orientation, gender identification, veteran status, pregnancy, or any other protected classification.
❖ Performing requirement analysis by interacting with the BA, PM, and
Architect.
❖ Developing high-level and detailed designs.
❖ Designing, developing, testing, implementing and maintaining high-volume, low-latency applications for critical systems, delivering high availability and performance.
❖ Writing well-designed, testable, reusable, efficient code.
❖ Managing the configuration of your own work.
❖ Reviewing the work of other developers and providing feedback.
❖ Mentoring and managing the dev team.
❖ Collaborating with the testing team on integration testing.
We will provide up to 30 LPA.

Hi All,
We are hiring!!
Company: SpringML India Pvt Ltd.
Role: Lead Data Engineer
Location: Hyderabad
Website: https://springml.com/
About Company:
At SpringML, we are all about empowering the 'doers' in companies to make smarter decisions with their data. Our predictive analytics products and solutions apply machine learning to today's most pressing business problems so customers get insights they can trust to drive business growth.
We are a tight-knit, friendly team of passionate and driven people who are dedicated to learning, get excited to solve tough problems and like seeing results, fast. Our core values include placing our customers first, empathy and transparency, and innovation. We are a team with a focus on individual responsibility, rapid personal growth, and execution. If you share similar traits, we want you on our team.
What's the opportunity?
SpringML is looking to hire a top-notch Lead Data Engineer who is passionate about working with data and using the latest distributed frameworks to process large datasets.
As a Lead Data Engineer, your primary role will be to design and build data pipelines. You will be focused on helping client projects with data integration, data prep and implementing machine learning on datasets.
In this role, you will work on some of the latest technologies, collaborate with partners on early wins, take a consultative approach with clients, interact daily with executive leadership, and help build a great company. Chosen team members will be part of the core team and play a critical role in scaling up our emerging practice.
Responsibilities:
- Ability to work as a member of a team assigned to design and implement data integration solutions.
- Build data pipelines using standard frameworks such as Hadoop, Apache Beam and other open-source solutions (a minimal Beam sketch follows this list).
- Learn quickly – the ability to understand and rapidly comprehend new areas, functional and technical, and apply detailed and critical thinking to customer solutions.
- Propose design solutions and recommend best practices for large-scale data analysis.
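As noted in the pipeline-building item above, here is a minimal sketch of an Apache Beam batch pipeline in Python, assuming the apache-beam[gcp] package; the bucket paths and the customer_id/amount field names are hypothetical placeholders. It aggregates per-customer totals from JSON files and runs on the DirectRunner by default, or on Dataflow when the appropriate pipeline options are supplied.

```python
# A minimal sketch, assuming the apache-beam[gcp] package. Paths and field
# names are hypothetical placeholders used only for illustration.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # DirectRunner by default; pass --runner=DataflowRunner plus project,
    # region and temp_location options to execute on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "KeyByCustomer" >> beam.Map(lambda r: (r["customer_id"], float(r["amount"])))
            | "SumPerCustomer" >> beam.CombinePerKey(sum)
            | "FormatCsv" >> beam.Map(lambda kv: f"{kv[0]},{kv[1]}")
            | "WriteTotals" >> beam.io.WriteToText("gs://example-bucket/output/customer_totals")
        )


if __name__ == "__main__":
    run()
```

The same pipeline shape extends to streaming by swapping the read transform for a source such as Pub/Sub and enabling streaming options.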
Skills:
- B.Tech degree in Computer Science, Mathematics or another relevant field.
- 6+ years of experience in ETL, data warehousing, visualization and building data pipelines.
- Strong programming skills – experience and expertise in one of the following: Java, Python, Scala, C.
- Proficient in big data/distributed computing frameworks such as Apache Spark and Kafka.
- Experience with Agile implementation methodology.

Minimum 2 years of MERN development experience.
Strong skills in working with Node.js and MongoDB frameworks/libraries. Ability to lead the entire product development from inception to completion.
Essential Skills -
Web/mobile development experience:
3-5 years of tech experience building scalable web/mobile applications.
Experienced in Node.js and its libraries, MongoDB, Redis, Firebase, and JavaScript frontend frameworks/libraries like AngularJS and React.
Building new RESTful services and APIs.
Experienced in working with AWS services like S3 and EC2, and DevOps tools like GitHub and Docker.
Essential Qualities:
Self-starter wanting to learn new technologies, tinker, experiment and implement new products. Wanting an entrepreneurial and cross-functional experience. Motivation to mentor younger incoming talent and lead projects.

