---
slug: /en/getting-started/example-datasets/menus
sidebar_label: New York Public Library "What's on the Menu?" Dataset
title: "New York Public Library \"What's on the Menu?\" Dataset"
---
The dataset was created by the New York Public Library. It contains historical data on the menus of hotels, restaurants and cafes, with the dishes and their prices.
Source: http://menus.nypl.org/data
The data is in the public domain.
The data is from the library's archive and it may be incomplete and difficult for statistical analysis. Nevertheless, it is also very yummy.
The size is just 1.3 million records about dishes in the menus — it's a very small data volume for ClickHouse, but it's still a good example.
## Download the Dataset {#download-dataset}
Run the command:
```bash
wget https://s3.amazonaws.com/menusdata.nypl.org/gzips/2021_08_01_07_01_17_data.tgz
# Optional: validate the checksum
md5sum 2021_08_01_07_01_17_data.tgz
# Checksum should be equal to: db6126724de939a5481e3160a2d67d15
```
If needed, replace this link with an up-to-date one from http://menus.nypl.org/data.
The download size is about 35 MB.
## Unpack the Dataset {#unpack-dataset}
```bash
tar xvf 2021_08_01_07_01_17_data.tgz
```
The uncompressed size is about 150 MB.
The data is normalized and consists of four tables:
- `Menu` — Information about menus: the name of the restaurant, the date when the menu was seen, etc.
- `Dish` — Information about dishes: the name of the dish along with some characteristics.
- `MenuPage` — Information about the pages in the menus; every page belongs to some menu.
- `MenuItem` — An item of the menu: a dish along with its price on some menu page, with links to the dish and the menu page.
## Create the Tables {#create-tables}
We use the [Decimal](../../sql-reference/data-types/decimal.md) data type to store prices.
```sql
CREATE TABLE dish
(
id UInt32,
name String,
description String,
menus_appeared UInt32,
times_appeared Int32,
first_appeared UInt16,
last_appeared UInt16,
lowest_price Decimal64(3),
highest_price Decimal64(3)
) ENGINE = MergeTree ORDER BY id;

CREATE TABLE menu
(
id UInt32,
name String,
sponsor String,
event String,
venue String,
place String,
physical_description String,
occasion String,
notes String,
call_number String,
keywords String,
language String,
date String,
location String,
location_type String,
currency String,
currency_symbol String,
status String,
page_count UInt16,
dish_count UInt16
) ENGINE = MergeTree ORDER BY id;

CREATE TABLE menu_page
(
id UInt32,
menu_id UInt32,
page_number UInt16,
image_id String,
full_height UInt16,
full_width UInt16,
uuid UUID
) ENGINE = MergeTree ORDER BY id;

CREATE TABLE menu_item
(
id UInt32,
menu_page_id UInt32,
price Decimal64(3),
high_price Decimal64(3),
dish_id UInt32,
created_at DateTime,
updated_at DateTime,
xpos Float64,
ypos Float64
) ENGINE = MergeTree ORDER BY id;
```
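A side note on the type used above: `Decimal64(3)` is shorthand for `Decimal(18, 3)`, so prices are stored exactly with three digits after the decimal point. A minimal illustration (not part of the guide itself):
```sql
-- Illustration only: Decimal64(3) is an alias for Decimal(18, 3),
-- so a price like 0.25 is stored exactly with 3 fractional digits.
SELECT
    toDecimal64(0.25, 3) AS price,
    toTypeName(toDecimal64(0.25, 3)) AS price_type;
```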
## Import the Data {#import-data}
To upload the data into ClickHouse, run:
```bash
clickhouse-client --format_csv_allow_single_quotes 0 --input_format_null_as_default 0 --query "INSERT INTO dish FORMAT CSVWithNames" < Dish.csv
clickhouse-client --format_csv_allow_single_quotes 0 --input_format_null_as_default 0 --query "INSERT INTO menu FORMAT CSVWithNames" < Menu.csv
clickhouse-client --format_csv_allow_single_quotes 0 --input_format_null_as_default 0 --query "INSERT INTO menu_page FORMAT CSVWithNames" < MenuPage.csv
clickhouse-client --format_csv_allow_single_quotes 0 --input_format_null_as_default 0 --date_time_input_format best_effort --query "INSERT INTO menu_item FORMAT CSVWithNames" < MenuItem.csv
```
We use the [CSVWithNames](../../interfaces/formats.md#csvwithnames) format as the data is represented by CSV with a header.
We disable `format_csv_allow_single_quotes` because only double quotes are used for data fields, while single quotes can appear inside the values and should not confuse the CSV parser.
We disable [input_format_null_as_default](../../operations/settings/settings-formats.md#settings-input-format-null-as-default) as our data does not contain [NULL](../../sql-reference/syntax.md#null-literal). Otherwise ClickHouse would try to parse `\N` sequences and could get confused by `\` in the data.
The setting [date_time_input_format best_effort](../../operations/settings/settings-formats.md#settings-date_time_input_format) allows parsing [DateTime](../../sql-reference/data-types/datetime.md) fields in a wide variety of formats. For example, ISO-8601 without seconds, like '2000-01-01 01:02', will be recognized. Without this setting, only the fixed DateTime format is allowed.
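As a quick, optional illustration (not part of the import commands), the same relaxed parsing is also exposed by the `parseDateTimeBestEffort` function:
```sql
-- Illustration only: best-effort parsing accepts an ISO-8601 timestamp
-- without seconds, which the strict default DateTime format does not.
SELECT parseDateTimeBestEffort('2000-01-01 01:02') AS parsed;
```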
## Denormalize the Data {#denormalize-data}
The data is presented in multiple tables in [normalized form](https://en.wikipedia.org/wiki/Database_normalization#Normal_forms). It means you have to perform a [JOIN](../../sql-reference/statements/select/join.md#select-join) if you want to query, for example, dish names from menu items.
For typical analytical tasks it is far more efficient to work with pre-JOINed data, to avoid doing a `JOIN` every time. This is called "denormalized" data.
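For example, just getting dish names next to their prices from the normalized tables already needs a query roughly like this sketch:
```sql
-- A sketch of the JOIN needed on the normalized tables
-- to see dish names together with their prices on menu items.
SELECT dish.name AS dish_name, menu_item.price
FROM menu_item
JOIN dish ON menu_item.dish_id = dish.id
LIMIT 10;
```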
We will create a table `menu_item_denorm` that contains all the data JOINed together:
```sql
CREATE TABLE menu_item_denorm
ENGINE = MergeTree ORDER BY (dish_name, created_at)
AS SELECT
price,
high_price,
created_at,
updated_at,
xpos,
ypos,
dish.id AS dish_id,
dish.name AS dish_name,
dish.description AS dish_description,
dish.menus_appeared AS dish_menus_appeared,
dish.times_appeared AS dish_times_appeared,
dish.first_appeared AS dish_first_appeared,
dish.last_appeared AS dish_last_appeared,
dish.lowest_price AS dish_lowest_price,
dish.highest_price AS dish_highest_price,
menu.id AS menu_id,
menu.name AS menu_name,
menu.sponsor AS menu_sponsor,
menu.event AS menu_event,
menu.venue AS menu_venue,
menu.place AS menu_place,
menu.physical_description AS menu_physical_description,
menu.occasion AS menu_occasion,
menu.notes AS menu_notes,
menu.call_number AS menu_call_number,
menu.keywords AS menu_keywords,
menu.language AS menu_language,
menu.date AS menu_date,
menu.location AS menu_location,
menu.location_type AS menu_location_type,
menu.currency AS menu_currency,
menu.currency_symbol AS menu_currency_symbol,
menu.status AS menu_status,
menu.page_count AS menu_page_count,
menu.dish_count AS menu_dish_count
FROM menu_item
JOIN dish ON menu_item.dish_id = dish.id
JOIN menu_page ON menu_item.menu_page_id = menu_page.id
JOIN menu ON menu_page.menu_id = menu.id;
```
## Validate the Data {#validate-data}
Query:
```sql
SELECT count() FROM menu_item_denorm;
```
Result:
```text
┌─count()─┐
│ 1329175 │
└─────────┘
```
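As an additional, optional sanity check (not in the original guide), you can compare this count with the row count of the source `menu_item` table; since the inner JOINs can only drop rows, the denormalized count should not exceed it and, for this dataset, should be close to it:
```sql
-- Optional sanity check: the denormalized table should have roughly
-- as many rows as the raw menu_item table (inner JOINs may drop a few).
SELECT
    (SELECT count() FROM menu_item) AS menu_item_rows,
    (SELECT count() FROM menu_item_denorm) AS denormalized_rows;
```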
## Run Some Queries {#run-queries}
### Averaged Historical Prices of Dishes {#query-averaged-historical-prices}
Query:
```sql
SELECT
round(toUInt32OrZero(extract(menu_date, '^\\d{4}')), -1) AS d,
count(),
round(avg(price), 2),
bar(avg(price), 0, 100, 100)
FROM menu_item_denorm
WHERE (menu_currency = 'Dollars') AND (d > 0) AND (d < 2022)
GROUP BY d
ORDER BY d ASC;
```
Result:
```text
┌────d─┬─count()─┬─round(avg(price), 2)─┬─bar(avg(price), 0, 100, 100)─┐
│ 1850 │ 618 │ 1.5 │ █▍ │
│ 1860 │ 1634 │ 1.29 │ █▎ │
│ 1870 │ 2215 │ 1.36 │ █▎ │
│ 1880 │ 3909 │ 1.01 │ █ │
│ 1890 │ 8837 │ 1.4 │ █▍ │
│ 1900 │ 176292 │ 0.68 │ ▋ │
│ 1910 │ 212196 │ 0.88 │ ▊ │
│ 1920 │ 179590 │ 0.74 │ ▋ │
│ 1930 │ 73707 │ 0.6 │ ▌ │
│ 1940 │ 58795 │ 0.57 │ ▌ │
│ 1950 │ 41407 │ 0.95 │ ▊ │
│ 1960 │ 51179 │ 1.32 │ █▎ │
│ 1970 │ 12914 │ 1.86 │ █▋ │
│ 1980 │ 7268 │ 4.35 │ ████▎ │
│ 1990 │ 11055 │ 6.03 │ ██████ │
│ 2000 │ 2467 │ 11.85 │ ███████████▋ │
│ 2010 │ 597 │ 25.66 │ █████████████████████████▋ │
└──────┴─────────┴──────────────────────┴──────────────────────────────┘
```
Take it with a grain of salt.
### Burger Prices {#query-burger-prices}
Query:
```sql
SELECT
round(toUInt32OrZero(extract(menu_date, '^\\d{4}')), -1) AS d,
count(),
round(avg(price), 2),
bar(avg(price), 0, 50, 100)
FROM menu_item_denorm
WHERE (menu_currency = 'Dollars') AND (d > 0) AND (d < 2022) AND (dish_name ILIKE '%burger%')
GROUP BY d
ORDER BY d ASC;
```
Result:
```text
┌────d─┬─count()─┬─round(avg(price), 2)─┬─bar(avg(price), 0, 50, 100)───────────┐
│ 1880 │ 2 │ 0.42 │ ▋ │
│ 1890 │ 7 │ 0.85 │ █▋ │
│ 1900 │ 399 │ 0.49 │ ▊ │
│ 1910 │ 589 │ 0.68 │ █▎ │
│ 1920 │ 280 │ 0.56 │ █ │
│ 1930 │ 74 │ 0.42 │ ▋ │
│ 1940 │ 119 │ 0.59 │ █▏ │
│ 1950 │ 134 │ 1.09 │ ██▏ │
│ 1960 │ 272 │ 0.92 │ █▋ │
│ 1970 │ 108 │ 1.18 │ ██▎ │
│ 1980 │ 88 │ 2.82 │ █████▋ │
│ 1990 │ 184 │ 3.68 │ ███████▎ │
│ 2000 │ 21 │ 7.14 │ ██████████████▎ │
│ 2010 │ 6 │ 18.42 │ ████████████████████████████████████▋ │
└──────┴─────────┴──────────────────────┴───────────────────────────────────────┘
```
### Vodka {#query-vodka}
Query:
```sql
SELECT
round(toUInt32OrZero(extract(menu_date, '^\\d{4}')), -1) AS d,
count(),
round(avg(price), 2),
bar(avg(price), 0, 50, 100)
FROM menu_item_denorm
WHERE (menu_currency IN ('Dollars', '')) AND (d > 0) AND (d < 2022) AND (dish_name ILIKE '%vodka%')
GROUP BY d
ORDER BY d ASC;
```
Result:
```text
┌────d─┬─count()─┬─round(avg(price), 2)─┬─bar(avg(price), 0, 50, 100)─┐
│ 1910 │ 2 │ 0 │ │
│ 1920 │ 1 │ 0.3 │ ▌ │
│ 1940 │ 21 │ 0.42 │ ▋ │
│ 1950 │ 14 │ 0.59 │ █▏ │
│ 1960 │ 113 │ 2.17 │ ████▎ │
│ 1970 │ 37 │ 0.68 │ █▎ │
│ 1980 │ 19 │ 2.55 │ █████ │
│ 1990 │ 86 │ 3.6 │ ███████▏ │
│ 2000 │ 2 │ 3.98 │ ███████▊ │
└──────┴─────────┴──────────────────────┴─────────────────────────────┘
```
To get vodka we have to write `ILIKE '%vodka%'` and this definitely makes a statement.
### Caviar {#query-caviar}
Let's print caviar prices. Also, let's print the name of any dish with caviar.
Query:
```sql
SELECT
round(toUInt32OrZero(extract(menu_date, '^\\d{4}')), -1) AS d,
count(),
round(avg(price), 2),
bar(avg(price), 0, 50, 100),
any(dish_name)
FROM menu_item_denorm
WHERE (menu_currency IN ('Dollars', '')) AND (d > 0) AND (d < 2022) AND (dish_name ILIKE '%caviar%')
GROUP BY d
ORDER BY d ASC;
```
Result:
```text
┌────d─┬─count()─┬─round(avg(price), 2)─┬─bar(avg(price), 0, 50, 100)──────┬─any(dish_name)──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│ 1090 │ 1 │ 0 │ │ Caviar │
│ 1880 │ 3 │ 0 │ │ Caviar │
│ 1890 │ 39 │ 0.59 │ █▏ │ Butter and caviar │
│ 1900 │ 1014 │ 0.34 │ ▋ │ Anchovy Caviar on Toast │
│ 1910 │ 1588 │ 1.35 │ ██▋ │ 1/1 Brötchen Caviar │
│ 1920 │ 927 │ 1.37 │ ██▋ │ ASTRAKAN CAVIAR │
│ 1930 │ 289 │ 1.91 │ ███▋ │ Astrachan caviar │
│ 1940 │ 201 │ 0.83 │ █▋ │ (SPECIAL) Domestic Caviar Sandwich │
│ 1950 │ 81 │ 2.27 │ ████▌ │ Beluga Caviar │
│ 1960 │ 126 │ 2.21 │ ████▍ │ Beluga Caviar │
│ 1970 │ 105 │ 0.95 │ █▊ │ BELUGA MALOSSOL CAVIAR AMERICAN DRESSING │
│ 1980 │ 12 │ 7.22 │ ██████████████▍ │ Authentic Iranian Beluga Caviar the world's finest black caviar presented in ice garni and a sampling of chilled 100° Russian vodka │
│ 1990 │ 74 │ 14.42 │ ████████████████████████████▋ │ Avocado Salad, Fresh cut avocado with caviare │
│ 2000 │ 3 │ 7.82 │ ███████████████▋ │ Aufgeschlagenes Kartoffelsueppchen mit Forellencaviar │
│ 2010 │ 6 │ 15.58 │ ███████████████████████████████▏ │ "OYSTERS AND PEARLS" "Sabayon" of Pearl Tapioca with Island Creek Oysters and Russian Sevruga Caviar │
└──────┴─────────┴──────────────────────┴──────────────────────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
```
At least they have caviar with vodka. Very nice.
## Online Playground {#playground}
The data is uploaded to ClickHouse Playground, see this [example](https://sql.clickhouse.com?query_id=KB5KQJJFNBKHE5GBUJCP1B).