Senior Data Analyst

at Rebel Foods

Posted by Ankit Suman
Mumbai
6 - 10 yrs
₹16L - ₹28L / yr (ESOP available)
Full time
Skills
Data Analytics
NoSQL Databases
Python
NumPy
pandas
Data Visualization
SQL

About Rebel Foods:

The world's leading consumer companies are all technology / new-age companies - Amazon (retail), Airbnb (hospitality), Uber (mobility), Netflix / Spotify (entertainment). The only sector where traditional companies are still the largest is restaurants - McDonald's, with a market cap of 130 BN USD. With food delivery growing exponentially worldwide, there is an opportunity to build the world's most valuable restaurant company on the internet, superfast. We have the formula to be that company. Today, we can safely say we are the world's largest delivery-only / internet restaurant company, and by a wide margin, with 4000+ individual internet restaurants in 40+ cities and 7 countries (India, Indonesia, UAE, UK, Malaysia, Singapore, Bangladesh). It's still Day 1, but we know we are onto something very, very big.

 

We have a once-in-a-lifetime opportunity to change a 500-year-old industry that hasn't been disrupted at its core by technology. For more details on how we are changing the restaurant industry from the core, please refer to the links below. They are important reading if you want to know our company better and really explore working with us:

 

Link 1: https://medium.com/@jaydeep_barman/why-is-rebel-foods-hiring-super-talented-engineers-b88586223ebe

Link 2: https://medium.com/@jaydeep_barman/how-to-build-1000-restaurants-in-24-months-the-rebel-method-cb5b0cea4dc8

Link 3: https://medium.com/faasos-story/winning-the-last-frontier-for-consumer-internet-5f2a659c43db

Link 4: https://medium.com/faasos-story/a-unique-take-on-food-tech-dcef8c51ba41


The Role

  • Understand business and data requirements from stakeholders
  • Create business reports
  • Automate recurring reports
  • Create dashboards/visualizations for business KPIs
  • Mine data for business insights
  • Drive initiatives based on business needs and requirements
  • Evaluate business processes, uncover areas of improvement, optimize strategies, and implement solutions

 

Requirements:

  • Acquire, aggregate, restructure, and analyse large datasets
  • Ability to work with various SQL and NoSQL data sources, S3, APIs, etc.
  • Data manipulation experience with Python/pandas
  • Strong analytical, technical, troubleshooting, and problem-solving skills
  • Ability to work in a team-oriented, fast-paced environment managing multiple priorities
  • Project management and organizational skills
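As a rough sketch of the pandas work described above (acquire, aggregate, restructure), consider the following toy example; the column names and figures are invented for illustration:

```python
import pandas as pd

# Toy order-level data standing in for a real source (a SQL table, S3 file, or API).
orders = pd.DataFrame({
    "city":    ["Mumbai", "Mumbai", "Pune", "Pune", "Pune"],
    "brand":   ["A", "B", "A", "A", "B"],
    "revenue": [120, 80, 60, 90, 50],
})

# Aggregate: total revenue per city and brand.
summary = orders.groupby(["city", "brand"], as_index=False)["revenue"].sum()

# Restructure: pivot into a city x brand matrix suitable for a report.
report = summary.pivot(index="city", columns="brand", values="revenue").fillna(0)
print(report)
```

The same groupby/pivot pattern carries over unchanged when the frame is loaded from a database or S3 instead of being built inline.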

 

 

Unique Opportunity

  • Get a chance to work on interesting Data Science/Machine Learning problems in the areas of Computer Vision, Natural Language Processing, and Time-Series Forecasting
  • Get a chance to work on Deep Analytics systems built on large amounts of multi-source data using advanced Data Engineering

 

Languages - SQL, Python (Pandas)

BI Tools - Tableau/Power BI/Quicksight

 

The Rebel Culture

We believe in empowering and growing people to perform their best at their job functions. We follow an outcome-oriented, fail-fast, iterative and collaborative culture to move fast in building tech solutions.

Rebel is not a usual workplace. The following slides will give you a sense of our culture, how Rebel conducts itself and who will be the best fit for our company. We suggest you go through it before making up your mind.

Link 5: https://www.slideshare.net/JaydeepBarman/culture-rebel

 

About Rebel Foods

At Rebel Foods, we are challenging the status quo by building the world's most valuable restaurant company on the internet, superfast. The opportunity for us is immense due to the exponential growth of the food delivery business worldwide, which has helped us build 'The World's Largest Internet Restaurant Company' in the last few years. Rebel Foods' current presence in 7 countries (India, Indonesia, UAE, UK, Malaysia, Singapore, Bangladesh), with 15+ brands and 3500+ internet restaurants, has been built on a simple system - The Rebel Operating Model. While for us it is still Day 1, we know we are in the middle of a revolution towards creating never-seen-before customer-first experiences. We bring you a once-in-a-lifetime opportunity to disrupt a 500-year-old industry with technology at its core.


Here, at Rebel Foods, we are using technology and automation to disrupt the traditional food industry. We are focused on building an operating system for Cloud Kitchens - using the most innovative technologies - to provide the best food experiences for our customers. 

Founded: 2011
Type: Product
Size: 500-1000 employees
Stage: Raised funding

Similar jobs

Data Engineer - Sr.

at Leading Fleet Mgmt. Platform

Agency job
via Qrata
Data engineering
Apache Kafka
Spark
data engineer
Big Data
Python
Distributed Systems
Remote only
4 - 8 yrs
₹20L - ₹45L / yr
Required Skills
● Experience with various stream processing and batch processing tools (Kafka, Spark, etc.), and programming with Python.
● Experience with relational and non-relational databases.
● Fairly good understanding of AWS (or any equivalent).


Key Responsibilities
● Design new systems and redesign existing systems to work at scale.
● Care about things like fault tolerance, durability, backups and recovery, performance, maintainability, code simplicity, etc.
● Lead a team of software engineers and help create an environment of ownership and learning.
● Introduce best practices of software development and ensure their adoption across the team.
● Help set and maintain coding standards for the team.
Job posted by
Blessy Fernandes

Associate - Tableau Developer

at Angel One

Founded 1987  •  Product  •  1000-5000 employees  •  Profitable
Tableau
SQL
MS-Excel
Data Analytics
Business Intelligence (BI)
Tableau Developer
Dashboarding
Reporting tools
Business Analytics
Remote only
3 - 5 yrs
₹8L - ₹10L / yr

    Job description:

  • Design, develop, and maintain complex Tableau reports for scalability, manageability, extensibility, performance, and re-use
  • Work with team members to create useful reports and dashboards that provide insight, improve/automate processes, or otherwise add value to the team
  • Write complex SQL queries to automate dashboards
  • Implement tools and strategies to translate raw data into valuable business insights
  • Identify patterns and trends in data sets
  • Work alongside teams to establish business needs
  • Provide recommendations to update the current MIS to improve reporting efficiency and consistency

Requirement / Desired Skills:

  • Solid experience in dashboarding and reporting; industry experience is a plus
  • Knowledge of Excel and SQL; expertise with business intelligence tools
  • Ability to analyse and interpret data
  • Problem-solving skills
  • Methodical and logical approach
  • Accuracy and attention to detail
  • Willingness to learn and adapt to new technologies

 

Job posted by
Sailee Dorle

Data Analyst

at BitClass

Founded 2020  •  Products & Services  •  20-100 employees  •  Raised funding
Data Analytics
Statistical Analysis
data analyst
SQL
Python
Bengaluru (Bangalore)
1 - 4 yrs
₹8L - ₹16L / yr
BitClass, a VC-funded edtech startup (backed by investors of Unacademy, Doubtnut, ShareChat, and Meesho, among others), is looking for a data analyst to help make better business decisions using information from the available data. Your responsibility will be to gather and prepare data from multiple sources, run statistical analyses, and communicate your findings in a clear and objective way.

Responsibilities

- Understanding the business requirements so as to formulate the problems to solve and restrict the slice of data to be explored.
- Collecting data from various sources.
- Performing cleansing, processing, and validation on the data subject to analyze, in order to ensure its quality.
- Exploring and visualizing data.
- Performing statistical analysis and experiments to derive business insights.
- Clearly communicating the findings from the analysis to turn information into something actionable through reports, dashboards, and/or presentations.

Skills

- Experience solving problems in the project’s business domain.
- Experience with data integration from multiple sources 
- Proficiency in at least one query language, especially SQL.
- Working experience with NoSQL databases, such as MongoDB and Elasticsearch.
- Working experience with popular statistical and machine learning techniques, such as clustering, linear regression, KNN, decision trees, etc.
- Good scripting skills using Python, R or any other relevant language
- Proficiency in at least one data visualization tool, such as Matplotlib, Plotly, D3.js, ggplot, etc.
- Great communication skills.
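One of the statistical techniques named above, linear regression, can be sketched from scratch on toy data (the numbers are invented; in practice you would reach for scikit-learn or statsmodels):

```python
# Ordinary least squares on a single feature: slope = cov(x, y) / var(x).
xs = [1, 2, 3, 4, 5]       # e.g. week number
ys = [10, 14, 19, 25, 30]  # e.g. classes booked that week

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    / sum((x - mean_x) ** 2 for x in xs)
)
intercept = mean_y - slope * mean_x

# The fitted line y = slope * x + intercept can then be used for simple forecasts.
print(f"y = {slope:.2f}x + {intercept:.2f}")
```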
Job posted by
Utsav Tiwary

Data Engineer

at Indium Software

Founded 1999  •  Services  •  100-1000 employees  •  Profitable
SQL
Python
Hadoop
HiveQL
Spark
PySpark
Bengaluru (Bangalore), Hyderabad
1 - 9 yrs
₹1L - ₹15L / yr

Responsibilities:

* 3+ years of data engineering experience - design, develop, deliver, and maintain data infrastructures.
* SQL specialist - strong knowledge of and seasoned experience with SQL queries.
* Languages: Python.
* Good communicator; shows initiative; works well with stakeholders.
* Experience working closely with data analysts, providing the data they need and guiding them on issues.
* Solid ETL experience and Hadoop/Hive/PySpark/Presto/SparkSQL.
* Solid communication and articulation skills.
* Able to handle stakeholders independently, with minimal intervention from the reporting manager.
* Develop strategies to solve problems in logical yet creative ways.
* Create custom reports and presentations accompanied by strong data visualization and storytelling.

 

We would be excited if you have:

 

* Excellent communication and interpersonal skills

* Ability to meet deadlines and manage project delivery

* Excellent report-writing and presentation skills

* Critical thinking and problem-solving capabilities

Job posted by
Karunya P

Data Engineer

at TIGI HR Solution Pvt. Ltd.

Founded 2014  •  Services  •  Profitable
Data engineering
Hadoop
Big Data
Python
SQL
Amazon Web Services (AWS)
Windows Azure
Mumbai, Bengaluru (Bangalore), Pune, Hyderabad, Noida
2 - 5 yrs
₹10L - ₹17L / yr
Position: Data Engineer
Employee strength: around 600 across India
Working days: 5 days
Working time: flexible
Salary: 30-40% hike on current CTC
Currently work from home.
 
Job description:
  • Design, implement and support an analytical data infrastructure, providing ad hoc access to large data sets and computing power.
  • Contribute to development of standards and the design and implementation of proactive processes to collect and report data and statistics on assigned systems.
  • Research opportunities for data acquisition and new uses for existing data.
  • Provide technical development expertise for designing, coding, testing, debugging, documenting and supporting data solutions.
  • Experience building data pipelines to connect analytics stacks, client data visualization tools and external data sources.
  • Experience with cloud and distributed systems principles
  • Experience with Azure/AWS/GCP cloud infrastructure
  • Experience with Databricks Clusters and Configuration
  • Experience with Python, R, sh/bash and JVM-based languages including Scala and Java.
  • Experience with Hadoop family languages including Pig and Hive.
Job posted by
Rutu Lakhani

Credit Risk Analyst

at Niro

Founded 2021  •  Product  •  20-100 employees  •  Raised funding
Risk assessment
Risk Management
Risk analysis
Python
SAS
SPSS
R
Bengaluru (Bangalore)
2 - 4 yrs
₹7L - ₹15L / yr
  • Gather information from multiple data sources and make approval decisions mechanically
  • Read and interpret credit-related information about borrowers
  • Interpret, analyze, and assess all forms of complex information
  • Embark on risk assessment analysis
  • Maintain the company's credit exposure within a defined risk level, with set limits in mind
  • Build strategies to minimize risk and increase approval rates
  • Design Champion and Challenger tests, implement them, and read test results
  • Build line assignment strategies
Skills required:
- Credit risk modeling
- Statistical data understanding and interpretation
- Basic regression and advanced machine learning models
- Conversant with coding in Python, using libraries like scikit-learn
- Ability to build and understand decision trees
Job posted by
Vinay Gurram
PySpark
SQL
Data Warehouse (DWH)
ETL
Remote, Bengaluru (Bangalore)
5 - 8 yrs
₹12L - ₹20L / yr
SQL Developer with 7 years of relevant experience and strong communication skills.
 
Key responsibilities:
 
  • Creating, designing and developing data models
  • Prepare plans for all ETL (Extract/Transformation/Load) procedures and architectures
  • Validating results and creating business reports
  • Monitoring and tuning data loads and queries
  • Develop and prepare a schedule for a new data warehouse
  • Analyze large databases and recommend appropriate optimization for the same
  • Administer all requirements and design various functional specifications for data
  • Provide support to the Software Development Life cycle
  • Prepare various code designs and ensure efficient implementation of the same
  • Evaluate all codes and ensure the quality of all project deliverables
  • Monitor data warehouse work and provide subject matter expertise
  • Hands-on BI practices, data structures, data modeling, SQL skills
  • Minimum 1 year experience in Pyspark
Job posted by
Priyanka U

ETL developer

at fintech

Agency job
via Talentojcom
ETL
Druid Database
Java
Scala
SQL
Tableau
Python
Remote only
2 - 6 yrs
₹9L - ₹30L / yr
● Education in a science, technology, engineering, or mathematics discipline, preferably a bachelor's degree or equivalent experience
● Knowledge of database fundamentals and fluency in advanced SQL, including concepts such as windowing functions
● Knowledge of popular scripting languages for data processing such as Python, as well as familiarity with common frameworks such as Pandas
● Experience building streaming ETL pipelines with tools such as Apache Flink, Apache Beam, Google Cloud Dataflow, DBT, and equivalents
● Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or custom scripts
● Experience working with messaging systems such as Apache Kafka (and hosted equivalents such as Amazon MSK) and Apache Pulsar
● Familiarity with BI applications such as Tableau, Looker, or Superset
● Hands-on coding experience in Java or Scala
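As a small illustration of the windowing-function concept the listing mentions, here is a running total per group using the stdlib sqlite3 module (requires SQLite 3.25+; the table and values are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (city TEXT, day INTEGER, revenue INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("Mumbai", 1, 100), ("Mumbai", 2, 150), ("Pune", 1, 80), ("Pune", 2, 60)],
)

# SUM(...) OVER (PARTITION BY ... ORDER BY ...) computes a per-city running
# total without collapsing rows, unlike a plain GROUP BY.
rows = conn.execute("""
    SELECT city, day, revenue,
           SUM(revenue) OVER (PARTITION BY city ORDER BY day) AS running_total
    FROM orders
    ORDER BY city, day
""").fetchall()
for row in rows:
    print(row)
```

The same OVER/PARTITION BY syntax works in PostgreSQL, MySQL 8+, and most modern warehouses.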
Job posted by
Raksha Pant

ETL Engineer - Data Pipeline

at DataToBiz

Founded 2018  •  Services  •  20-100 employees  •  Bootstrapped
ETL
Amazon Web Services (AWS)
Amazon Redshift
Python
Chandigarh, NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹7L - ₹15L / yr
Job Responsibilities:
- Develop new data pipelines and ETL jobs for processing millions of records, built to scale with growth.
- Optimise pipelines to handle real-time data, batch updates, and historical data.
- Establish scalable, efficient, automated processes for complex, large-scale data analysis.
- Write high-quality code to gather and manage large data sets (both real-time and batch) from multiple sources, perform ETL, and store them in a data warehouse.
- Manipulate and analyse complex, high-volume, high-dimensional data from varying sources using a variety of tools and data analysis techniques.
- Participate in data pipeline health monitoring and performance optimisation, as well as quality documentation.
- Interact with end users/clients and translate business language into technical requirements.
- Act independently to expose and resolve problems.

Job Requirements:
- 2+ years of experience in software development and data pipeline development for enterprise analytics.
- 2+ years of working with Python, with exposure to various warehousing tools.
- In-depth work with commercial tools like AWS Glue, Talend, Informatica, DataStage, etc.
- Experience with various relational databases like MySQL, MS SQL Server, Oracle, etc. is a must.
- Experience with analytics and reporting tools (Tableau, Power BI, SSRS, SSAS).
- Experience with various DevOps practices, helping the client deploy and scale systems as per requirement.
- Strong verbal and written communication skills with other developers and business clients.
- Knowledge of the logistics and/or transportation domain is a plus.
- Hands-on experience with traditional databases and ERP systems like Sybase and PeopleSoft.
Job posted by
PS Dhillon

Technical Architect

at E-Commerce Product Based Company

Agency job
via myApps Solutions
Technical Architecture
Big Data
IT Solutioning
Python
Rest API
Bengaluru (Bangalore)
8 - 15 yrs
₹15L - ₹30L / yr

Role and Responsibilities

  • Build a low latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
  • Build robust RESTful APIs that serve data and insights to DataWeave and other products
  • Design user interaction workflows on our products and integrate them with data APIs
  • Help stabilize and scale our existing systems. Help design the next generation systems.
  • Scale our back end data and analytics pipeline to handle increasingly large amounts of data.
  • Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
  • Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
  • Constantly think scale, think automation. Measure everything. Optimize proactively.
  • Be a tech thought leader. Add passion and vibrance to the team. Push the envelope.

 

Skills and Requirements

  • 8 - 15 years of experience building and scaling APIs and web applications.
  • Experience building and managing large scale data/analytics systems.
  • Have a strong grasp of CS fundamentals and excellent problem solving abilities. Have a good understanding of software design principles and architectural best practices.
  • Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
  • Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
  • Be a self-starter: someone who thrives in fast-paced environments with minimal 'management'.
  • Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elastic.
  • Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
  • Use the command line like a pro. Be proficient in Git and other essential software development tools.
  • Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
  • Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog etc.
  • Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
  • Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
  • It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time. Show off some of your projects you have hosted on GitHub.
Job posted by
BasavRaj P S