---
slug: /en/sql-reference/aggregate-functions/reference/sparkbar
sidebar_position: 311
sidebar_label: sparkbar
---
# sparkbar
The function plots a frequency histogram for values `x` and the repetition rate `y` of these values over the interval `[min_x, max_x]`.
Repetitions for all `x` falling into the same bucket are averaged, so data should be pre-aggregated.
Negative repetitions are ignored.
If no interval is specified, the minimum `x` is used as the interval start and the maximum `x` as the interval end.
Otherwise, values outside the interval are ignored.
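The bucketing rules above can be sketched in Python. This is a simplified illustration, not ClickHouse's implementation; the exact bucket-boundary and bar-scaling formulas here are assumptions:

```python
# A simplified sketch of the bucketing rules described above.
# Not ClickHouse's implementation: bucket-boundary and bar-scaling
# details are illustrative assumptions.
BARS = " ▁▂▃▄▅▆▇█"  # index 0 is an empty bucket, 8 is the tallest bar

def sparkbar_sketch(buckets, min_x, max_x, pairs):
    """Render (x, y) pairs as a bar string of length `buckets`."""
    sums = [0.0] * buckets
    counts = [0] * buckets
    for x, y in pairs:
        if y < 0 or not (min_x <= x <= max_x):
            continue  # negative repetitions and out-of-interval x are ignored
        # Map x proportionally onto a bucket index in [0, buckets - 1].
        i = min(int((x - min_x) * buckets / (max_x - min_x + 1)), buckets - 1)
        sums[i] += y
        counts[i] += 1
    # Repetitions within a bucket are averaged, per the description above.
    heights = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    top = max(heights) or 1.0
    return "".join(
        BARS[max(1, round(h / top * 8))] if h > 0 else BARS[0] for h in heights
    )
```

For example, `sparkbar_sketch(4, 0, 3, [(0, 4), (1, 8), (2, -1), (3, 2)])` returns `'▄█ ▂'`: the negative pair is dropped, its bucket stays empty, and the remaining bars are scaled against the tallest bucket.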
**Syntax**
``` sql
sparkbar(buckets[, min_x, max_x])(x, y)
```
**Parameters**
- `buckets` — The number of segments. Type: [Integer](../../../sql-reference/data-types/int-uint.md).
- `min_x` — The interval start. Optional parameter.
- `max_x` — The interval end. Optional parameter.
**Arguments**
- `x` — The field with values.
- `y` — The field with the frequency of values.
**Returned value**
- The frequency histogram.
**Example**
Query:
``` sql
CREATE TABLE spark_bar_data (`value` Int64, `event_date` Date) ENGINE = MergeTree ORDER BY event_date;
INSERT INTO spark_bar_data VALUES (1,'2020-01-01'), (3,'2020-01-02'), (4,'2020-01-02'), (-3,'2020-01-02'), (5,'2020-01-03'), (2,'2020-01-04'), (3,'2020-01-05'), (7,'2020-01-06'), (6,'2020-01-07'), (8,'2020-01-08'), (2,'2020-01-11');
SELECT sparkbar(9)(event_date, cnt) FROM (SELECT sum(value) AS cnt, event_date FROM spark_bar_data GROUP BY event_date);
SELECT sparkbar(9, toDate('2020-01-01'), toDate('2020-01-10'))(event_date, cnt) FROM (SELECT sum(value) AS cnt, event_date FROM spark_bar_data GROUP BY event_date);
```
Result:
``` text
┌─sparkbar(9)(event_date, cnt)─┐
│ ▂▅▂▃▆█ ▂ │
└──────────────────────────────┘
┌─sparkbar(9, toDate('2020-01-01'), toDate('2020-01-10'))(event_date, cnt)─┐
│ ▂▅▂▃▇▆█ │
└──────────────────────────────────────────────────────────────────────────┘
```