---
title: 'How to speed up LZ4 decompression in ClickHouse?'
image: 'https://blog-images.clickhouse.tech/en/2019/how-to-speed-up-lz4-decompression-in-clickhouse/main.jpg'
date: '2019-06-25'
tags: ['performance', 'lz4', 'article', 'decompression']
---
When you run queries in [ClickHouse](https://clickhouse.tech/), you might notice that the profiler often shows the `LZ_decompress_fast` function near the top. What is going on? This question had us wondering how to choose the best compression algorithm.

ClickHouse stores data in compressed form. When running queries, ClickHouse tries to do as little as possible in order to conserve CPU resources. In many cases, all the potentially time-consuming computations are already well optimized, and the user wrote a well thought-out query. Then all that's left to do is to perform decompression.
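
To make that decompression step concrete, here is a minimal sketch of an LZ4 round trip using the upstream `lz4` C library (`LZ4_compress_default` / `LZ4_decompress_safe`). This is illustrative only, with made-up sample data, and is not ClickHouse's specialized `LZ_decompress_fast` variants that the article linked below dissects:

```cpp
#include <lz4.h>     // upstream LZ4 C API: LZ4_compressBound, LZ4_compress_default, LZ4_decompress_safe

#include <cstring>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    // Some compressible sample data, standing in for a column block.
    std::string original(1 << 20, 'a');
    for (size_t i = 0; i < original.size(); i += 64)
        original[i] = static_cast<char>('a' + i % 17);

    // Compress into a buffer sized with LZ4_compressBound().
    std::vector<char> compressed(LZ4_compressBound(static_cast<int>(original.size())));
    const int compressed_size = LZ4_compress_default(
        original.data(), compressed.data(),
        static_cast<int>(original.size()), static_cast<int>(compressed.size()));
    if (compressed_size <= 0)
    {
        std::cerr << "compression failed\n";
        return 1;
    }

    // Decompress back: this is the kind of work that shows up in the query profile.
    std::vector<char> decompressed(original.size());
    const int decompressed_size = LZ4_decompress_safe(
        compressed.data(), decompressed.data(),
        compressed_size, static_cast<int>(decompressed.size()));

    const bool ok = decompressed_size == static_cast<int>(original.size())
        && std::memcmp(decompressed.data(), original.data(), original.size()) == 0;

    std::cout << "compressed " << original.size() << " -> " << compressed_size
              << " bytes, round trip " << (ok ? "ok" : "FAILED") << "\n";
}
```

Compile and link against the LZ4 library, for example with `g++ lz4_roundtrip.cpp -llz4`.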
[Read further](https://habr.com/en/company/yandex/blog/457612/)