About Rebel Foods: The world's leading consumer companies are all technology / new-age companies - Amazon (retail), Airbnb (hospitality), Uber (mobility), Netflix / Spotify (entertainment). The only sector where traditional companies are still the largest is restaurants - McDonald's, with a market cap of 130 BN USD. With food delivery growing exponentially worldwide, there is an opportunity to build the world's most valuable restaurant company on the internet, superfast. We have the formula to be that company. Today, we can safely say we are the world's largest delivery-only / internet restaurant company, and by a wide margin, with 4000+ individual internet restaurants across 40+ cities and 7 countries (India, Indonesia, UAE, UK, Malaysia, Singapore, Bangladesh). It's still Day 1, but we know we are onto something very, very big. We have a once-in-a-lifetime opportunity to change a 500-year-old industry that has never been disrupted at its core by technology. For more details on how we are changing the restaurant industry from the core, please refer to the links below. It's important reading if you want to know our company better and really explore working with us.
https://medium.com/@jaydeep_barman/how-to-build-1000-restaurants-in-24-months-the-rebel-method-cb5b0cea4dc8
https://medium.com/faasos-story/winning-the-last-frontier-for-consumer-internet-5f2a659c43db
https://medium.com/faasos-story/a-unique-take-on-food-tech-dcef8c51ba41
Data Engineering @ Rebel
Data Engineering @ Rebel comprises Data Platform, Data Science (ML/Statistics), Analytics (Business & System - MIS/Reports/BI tools), and AI/IoT (vision computing, quality checks, etc.). We work on many interesting data engineering use cases at scale which require constant innovation and disruption. A few of those are -
- Personalization - Recommendation Engine, Dynamic pricing etc
- Customer engagement/retention, Campaign Management
- NLP to improve customer experiences
- Predictive modelling for inventory, resources, kitchen operations etc
- Operating metrics for kitchen machines, resources etc
- Vision Computing to do food quality check, product identification, safety checks across the kitchens
- Big data platform to ingest events, transaction data, behavioral data from millions of consumers
Future of Data Engineering @ Rebel
The Rebel Engineering function is working on software + robotics + automation to solve the toughest problems for our customers and make their food missions unique, memorable, delightful and sure. We believe in the continuous adoption of emerging technologies to solve customer problems in a fast and innovative fashion.
Data platform and Data science will remain the backbone in driving the decisions for the best customer experiences and the best operating models.
We are on the lookout for someone who can come on board to solve known and unknown business & customer use cases through the data platform and data science.
In this role, you must have a passion for research and data to solve problems at scale for both customers and the company. You should be highly proficient in one or more of the following skill sets:
- Data Science - Machine learning (regression, classification, random forests, k-nearest neighbors, etc.), deep learning (neural networks, TensorFlow/Keras, etc.), AI, statistics, SQL/NoSQL databases
- Data Platform - Big data technologies such as Spark, Hadoop, Teradata, SQL/NoSQL databases, storage systems, streaming APIs, and so forth
- Data Analytics - Data visualisation (Reporting/MIS/BI) tools, SQL, ETL
We expect you to be excellent at writing efficient programs (Python/R/Java, etc.) and to have strong problem-solving skills. You will work closely with the Business, Product and Engineering teams and will report to the Head of Data Engineering.
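For illustration, the "regression" item in the skill sets above can be sketched in a few lines of pure Python. This is a toy closed-form least-squares fit, not a description of Rebel's actual models or stack; the names and numbers are made up, and real work would use libraries like scikit-learn or TensorFlow.

```python
# Toy sketch: simple linear regression via the closed-form
# least-squares solution (one of the techniques listed above).
# Pure Python for illustration only.

def fit_linear(xs, ys):
    """Return (intercept, slope) minimising squared error for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of x,y and variance of x (unnormalised; the n cancels).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return mean_y - slope * mean_x, slope

# Hypothetical example: predict daily orders from marketing spend.
intercept, slope = fit_linear([1, 2, 3, 4], [3, 5, 7, 9])
```

The same closed form underlies the first step of most regression tooling; production models would add regularisation, validation, and feature pipelines.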
The Rebel Culture
We believe in empowering and growing people to perform at their best in their job functions. We follow an outcome-oriented, fail-fast, iterative & collaborative culture to move fast in building tech solutions.
Rebel is not a usual workplace. The following slides will give you a sense of our culture, how Rebel conducts itself and who will be the best fit for our company. We suggest you go through it before making up your mind.
https://www.slideshare.net/JaydeepBarman/culture-rebel
About Rebel Foods
At Rebel Foods, we are challenging the status quo by building the world's most valuable restaurant company on the internet, superfast. The opportunity for us is immense due to the exponential growth in the food delivery business worldwide, which has helped us build 'The World's Largest Internet Restaurant Company' in the last few years. Rebel Foods' current presence in 7 countries (India, Indonesia, UAE, UK, Malaysia, Singapore, Bangladesh) with 15+ brands and 3500+ internet restaurants has been built on a simple system - the Rebel Operating Model. While for us it is still Day 1, we know we are in the middle of a revolution towards creating never-seen-before customer-first experiences. We bring you a once-in-a-lifetime opportunity to disrupt a 500-year-old industry with technology at its core.
Here, at Rebel Foods, we are using technology and automation to disrupt the traditional food industry. We are focused on building an operating system for Cloud Kitchens - using the most innovative technologies - to provide the best food experiences for our customers.
- Minimum of 2 years' experience with Google BigQuery and Google Cloud Platform.
- Design and develop the ETL framework using BigQuery
- Expertise in BigQuery concepts like nested queries, clustering, partitioning, etc.
- Working experience with clickstream databases and Google Analytics / Adobe Analytics.
- Should be able to automate data loads from BigQuery using APIs or a scripting language.
- Good experience with advanced SQL concepts.
- Good experience with Adobe Launch web, mobile & e-commerce tag implementation.
- Identify complex, fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions
- Responsible for defining, analyzing, and communicating key metrics and business trends to management teams
- Identify opportunities to improve conversion & user experience through data. Influence product & feature roadmaps.
- Must have a passion for data quality and constantly look to improve the system. Drive data-driven decision making with stakeholders & drive change management
- Understand requirements to translate business and technical problems into analytics problems.
- Effective storyboarding and presentation of the solution to the client and leadership.
- Client engagement & management
- Ability to interface effectively with multiple levels of management and functional disciplines.
- Assist in developing/coaching individuals technically as well as on soft skills during the project and as part of Client Project’s training program.
- 2 to 3 years of working experience with Google BigQuery & Google Cloud Platform
- Relevant experience in Consumer Tech/CPG/Retail industries
- Bachelor's degree in Engineering, Computer Science, Math, Statistics, or a related discipline
- Strong problem-solving and web analytics skills. Acute attention to detail.
- Experience in analyzing large, complex, multi-dimensional data sets.
- Experience in one or more roles in an online eCommerce or online support environment.
- Expertise in Google BigQuery & Google Cloud Platform
- Experience in advanced SQL and a scripting language (Python/R)
- Hands-on experience with BI tools (Tableau, Power BI)
- Working experience with and understanding of Adobe Analytics or Google Analytics
- Experience in creating and debugging website & app tracking (Omnibug, Dataslayer, GA Debugger, etc.)
- Excellent analytical thinking, analysis, and problem-solving skills.
- Knowledge of other GCP services is a plus
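As a flavour of the BigQuery automation described above (partitioned tables, scripted data loads), here is a hedged sketch that builds an incremental-load statement for a hypothetical date-partitioned table. The table and column names are invented, and a real pipeline would submit the SQL through the google-cloud-bigquery client rather than assemble strings by hand.

```python
# Sketch: generating an incremental-load script for a hypothetical
# date-partitioned BigQuery table. Names below (analytics.events,
# staging.events_raw, event_date, ...) are illustrative assumptions.
import datetime

def incremental_load_sql(target, source, load_date):
    """Build a simple delete-and-reinsert load for one day's partition."""
    return (
        f"DELETE FROM `{target}` WHERE event_date = '{load_date}';\n"
        f"INSERT INTO `{target}` (event_date, user_id, event_name)\n"
        f"SELECT event_date, user_id, event_name\n"
        f"FROM `{source}`\n"
        f"WHERE event_date = '{load_date}'"
    )

sql = incremental_load_sql(
    "analytics.events", "staging.events_raw", datetime.date(2023, 1, 15)
)
```

Scoping both statements to a single partition date keeps the reload idempotent and, on a date-partitioned table, limits the bytes scanned to that partition.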
- Design, development, and deployment of highly available and fault-tolerant enterprise business software at scale.
- Demonstrate tech expertise to go very deep or broad in solving classes of problems or creating broadly leverageable solutions.
- Execute large-scale projects - provide technical leadership in architecting and building product solutions.
- Collaborate across teams to deliver results, from hardworking team members within your group to smart technologists across lines of business.
- Be a role model in acting with good judgment and responsibility, helping teams to commit and move forward.
- Be a humble mentor and trusted advisor for talented team members and passionate leaders alike. Deal with differences of opinion in a mature and fair way.
- Raise the bar by improving standard methodologies, producing best-in-class efficient solutions, code, documentation, testing, and monitoring.
- 15+ years of relevant engineering experience.
- Proven record of building and productionizing highly reliable products at scale.
- Experience with Java and Python.
- Experience with Big Data technologies is a plus.
- Ability to assess new technologies and make pragmatic choices that help guide us towards a long-term vision.
- Can collaborate well with several other engineering orgs to articulate requirements and system design.
- Team player!
- Great interpersonal skills, deep technical ability, and a portfolio of successful execution.
- Excellent written and verbal communication skills, including the ability to write detailed technical documents.
- Passionate about helping teams grow by inspiring and mentoring engineers.
Desired Candidate Profile
1. Good knowledge of MySQL architecture
2. Knowledge of MySQL replication (Master-Master and Master-Slave), Galera Cluster, and troubleshooting
3. Must have knowledge of setting up MySQL clustering, tuning, and troubleshooting
4. Must have good knowledge of performance tuning of MySQL databases.
5. Must have good knowledge of MySQL database upgrades
6. Installation and configuration of MySQL on Linux
7. Understanding of MySQL backup & recovery.
8. Ability to multi-task and context-switch effectively between different activities and teams
9. Provide 24x7 support for critical production systems.
10. Excellent written and verbal communication.
11. Ability to organize and plan work independently.
12. Ability to work in a rapidly changing environment.
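As a small illustration of the replication troubleshooting listed above, here is a sketch of a health check over the key/value fields of MySQL's SHOW REPLICA STATUS (field names follow MySQL 8.0.22+; older versions report them as Slave_IO_Running, Seconds_Behind_Master, etc.). The lag threshold and sample values are assumptions for illustration.

```python
# Sketch: evaluating replica health from SHOW REPLICA STATUS fields.
# A real check would fetch these via a MySQL client; here we take a dict.

def replication_healthy(status, max_lag_seconds=30):
    """Return True if both replication threads run and lag is acceptable."""
    io_ok = status.get("Replica_IO_Running") == "Yes"
    sql_ok = status.get("Replica_SQL_Running") == "Yes"
    lag = status.get("Seconds_Behind_Source")
    # Lag is NULL (None) when the SQL thread is not running.
    lag_ok = lag is not None and int(lag) <= max_lag_seconds
    return io_ok and sql_ok and lag_ok

# Hypothetical sample output from a healthy replica.
sample = {
    "Replica_IO_Running": "Yes",
    "Replica_SQL_Running": "Yes",
    "Seconds_Behind_Source": "4",
}
```

Checks like this are typically wired into monitoring so a stopped SQL thread or growing lag pages the on-call DBA before it affects production reads.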
JD for IoT Data Engineer:
The role requires experience in Azure core technologies - IoT Hub / Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos DB, Azure Data Factory, Azure SQL Database, Azure HDInsight / Databricks, and SQL Data Warehouse.
- Minimum 2 years of software development experience
- Minimum 2 years of experience in IoT/streaming data pipelines solution development
- Bachelor's and/or Master’s degree in computer science
- Strong Consulting skills in data management including data governance, data quality, security, data integration, processing, and provisioning
- Delivered data management projects with real-time/near real-time data insights delivery on Azure Cloud
- Translated complex analytical requirements into the technical design including data models, ETLs, and Dashboards / Reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
- Successfully delivered large scale IOT data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile
- Experience in handling telemetry data with Spark Streaming, Kafka, Flink, Scala, Pyspark, Spark SQL.
- Hands-on experience with containers and Docker
- Exposure to streaming protocols like MQTT and AMQP
- Knowledge of OT network protocols like OPC UA, CAN Bus, and similar protocols
- Strong knowledge of continuous integration, static code analysis, and test-driven development
- Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore
- Must have excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platforms adoption across the enterprise
- Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations
Roles & Responsibilities
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design
- Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks
- Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform - on Azure Cosmos DB, SQL DW / Synapse, or SQL
- Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
- Automate core activities to minimize the delivery lead times and improve the overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts
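To give a flavour of the streaming pipelines described above, here is a sketch of a tumbling-window aggregation over kitchen telemetry, the kind of continuous computation Stream Analytics or Spark Structured Streaming would run. Device names, readings, and the window size are invented for illustration.

```python
# Sketch: tumbling-window average over telemetry events, simulated in
# batch. A streaming engine would maintain these windows incrementally.
from collections import defaultdict

def tumbling_window_avg(events, window_seconds=60):
    """events: iterable of (epoch_ts, device_id, reading).
    Returns {(window_start, device_id): average reading}."""
    sums = defaultdict(lambda: [0.0, 0])
    for ts, device, reading in events:
        # Align the timestamp to the start of its tumbling window.
        window_start = ts - (ts % window_seconds)
        acc = sums[(window_start, device)]
        acc[0] += reading
        acc[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

# Hypothetical fryer-temperature telemetry (epoch seconds, device, °C).
events = [
    (0, "fryer-1", 180.0),
    (30, "fryer-1", 184.0),
    (65, "fryer-1", 190.0),
]
averages = tumbling_window_avg(events)
```

The first two readings fall in the 0-60s window and the third in the 60-120s window; the same grouping logic maps directly onto a windowed GROUP BY in Stream Analytics or a window() aggregation in Spark.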