tests: fix 01281_group_by_limit_memory_tracking flakiness (#41178)

* tests: fix 01281_group_by_limit_memory_tracking flakiness

CI: https://s3.amazonaws.com/clickhouse-test-reports/41092/14d8f297d73f9f813c447474310fbfa0c5b18a0f/stateless_tests__debug__[2/3].html
Signed-off-by: Azat Khuzhin <a.khuzhin@semrush.com>

* tests: add missing bits into 01281_group_by_limit_memory_tracking

While rewriting the test in #11119 it lost the LIMIT, which made the test
useless; see details in #11022
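The effect of the LIMIT can be illustrated with a plain-shell analogy (illustration only, not part of the test itself): a consumer that stops after a few rows lets the producer terminate early instead of materializing the whole result.

```shell
# `head -n 10` plays the role of `LIMIT 10`: once 10 lines have been
# read, the pipe closes and the producer (`seq`) stops early.
seq 1000000 | head -n 10
```

This prints the numbers 1 through 10 and exits without generating the remaining rows.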

Signed-off-by: Azat Khuzhin <a.khuzhin@semrush.com>

Azat Khuzhin 2022-09-11 14:57:36 +03:00 committed by GitHub
parent 5802c2fdd2
commit 13f7a82a5b


@@ -9,6 +9,11 @@
 # - two-level group by
 # - max_memory_usage_for_user
 # - one users' query in background (to avoid reseting max_memory_usage_for_user)
+# - query with limit (to not consume all the rows)
+#
+# For details see [1].
+#
+# [1]: https://github.com/ClickHouse/ClickHouse/pull/11022
 CURDIR=$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)
 # shellcheck source=../shell_config.sh
@@ -23,19 +28,22 @@ function execute_null()
 function execute_group_by()
 {
-    # Peak memory usage for the main query (with GROUP BY) is ~100MiB (with
-    # max_threads=2 as here).
-    # So set max_memory_usage_for_user to 150MiB and if the memory tracking
+    # Peak memory usage for the main query (with GROUP BY) is ~84MiB (with options here).
+    # So set max_memory_usage_for_user to 120MiB and if the memory tracking
     # accounting will be incorrect then the second query will fail
     #
     # Note that we also need one running query for the user (sleep(3)), since
     # max_memory_usage_for_user is installed to 0 once there are no more
     # queries for user.
     local opts=(
-        "--max_memory_usage_for_user="$((200<<20))
-        "--max_threads=2"
+        "--max_memory_usage_for_user=$((120<<20))"
+        "--max_threads=1"
+        # this is to enable two level group by
+        # (using threads to enable it makes the query use non constant amount of memory)
+        "--max_bytes_before_external_group_by=$((1<<40))"
+        "--collect_hash_table_stats_during_aggregation=0"
     )
-    execute_null "${opts[@]}" <<<'SELECT uniq(number) FROM numbers_mt(1e6) GROUP BY number % 5e5'
+    execute_null "${opts[@]}" <<<'SELECT uniq(number) FROM numbers_mt(1e6) GROUP BY number % 5e5 LIMIT 10'
 }
 # This is needed to keep at least one running query for user for the time of test.
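For reference, the bit-shift expressions used for the new option values expand as plain shell arithmetic (a standalone sketch, independent of ClickHouse):

```shell
# 120<<20 shifts 120 left by 20 bits: 120 * 1 MiB = 120 MiB, in bytes.
echo $((120<<20))   # 125829120
# 1<<40 is 1 TiB -- large enough that the threshold set via
# max_bytes_before_external_group_by is effectively never reached.
echo $((1<<40))     # 1099511627776
```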