# WikiStat {#wikistat}

See: http://dumps.wikimedia.org/other/pagecounts-raw/

Creating a table:

``` sql
CREATE TABLE wikistat
(
    date Date,
    time DateTime,
    project String,
    subproject String,
    path String,
    hits UInt64,
    size UInt64
) ENGINE = MergeTree(date, (path, time), 8192);
```
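
The `MergeTree(date, (path, time), 8192)` clause above uses the legacy engine syntax. On current ClickHouse releases the same table can be declared with explicit clauses; the sketch below assumes the usual mapping (monthly partitioning on `date`, ordering by `(path, time)`, default index granularity of 8192):

``` sql
CREATE TABLE wikistat
(
    date Date,
    time DateTime,
    project String,
    subproject String,
    path String,
    hits UInt64,
    size UInt64
) ENGINE = MergeTree()
PARTITION BY toYYYYMM(date)       -- legacy MergeTree(date, ...) partitions by month of `date`
ORDER BY (path, time)             -- same sorting key as the legacy definition
SETTINGS index_granularity = 8192;
```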
Loading data:

``` bash
$ for i in {2007..2016}; do for j in {01..12}; do echo $i-$j >&2; curl -sSL "http://dumps.wikimedia.org/other/pagecounts-raw/$i/$i-$j/" | grep -oE 'pagecounts-[0-9]+-[0-9]+\.gz'; done; done | sort | uniq | tee links.txt
$ cat links.txt | while read link; do wget http://dumps.wikimedia.org/other/pagecounts-raw/$(echo $link | sed -r 's/pagecounts-([0-9]{4})([0-9]{2})[0-9]{2}-[0-9]+\.gz/\1/')/$(echo $link | sed -r 's/pagecounts-([0-9]{4})([0-9]{2})[0-9]{2}-[0-9]+\.gz/\1-\2/')/$link; done
$ ls -1 /opt/wikistat/ | grep gz | while read i; do echo $i; gzip -cd /opt/wikistat/$i | ./wikistat-loader --time="$(echo -n $i | sed -r 's/pagecounts-([0-9]{4})([0-9]{2})([0-9]{2})-([0-9]{2})([0-9]{2})([0-9]{2})\.gz/\1-\2-\3 \4-00-00/')" | clickhouse-client --query="INSERT INTO wikistat FORMAT TabSeparated"; done
```
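
Once the load completes, a quick sanity check from `clickhouse-client` can confirm the data arrived; the queries below are illustrative and not part of the original walkthrough:

``` sql
-- Total number of loaded rows.
SELECT count() FROM wikistat;

-- Ten most-viewed paths in the English Wikipedia project.
SELECT path, sum(hits) AS total_hits
FROM wikistat
WHERE project = 'en'
GROUP BY path
ORDER BY total_hits DESC
LIMIT 10;
```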
[Original article](https://clickhouse.tech/docs/es/getting_started/example_datasets/wikistat/) <!--hide-->