---
slug: /en/sql-reference/table-functions/hdfsCluster
sidebar_position: 81
sidebar_label: hdfsCluster
---
# hdfsCluster Table Function

Allows processing files from HDFS in parallel from many nodes in a specified cluster. On the initiator it creates a connection to all nodes in the cluster, expands the asterisks in the HDFS file path, and dispatches each file dynamically. A worker node asks the initiator for the next task to process and processes it; this is repeated until all tasks are finished.

**Syntax**

``` sql
hdfsCluster(cluster_name, URI, format, structure)
```
**Arguments**
- `cluster_name` — Name of a cluster that is used to build a set of addresses and connection parameters to remote and local servers.
- `URI` — URI to a file or a bunch of files. Supports the following wildcards in read-only mode: `*`, `**`, `?`, `{'abc','def'}` and `{N..M}` where `N`, `M` — numbers, `abc`, `def` — strings. For more information see [Wildcards In Path](../../engines/table-engines/integrations/s3.md#wildcards-in-path).
- `format` — The [format](../../interfaces/formats.md#formats) of the file.
- `structure` — Structure of the table. Format `'column1_name column1_type, column2_name column2_type, ...'`.

**Returned value**

A table with the specified structure for reading data in the specified file.

**Examples**

1. Suppose that we have a ClickHouse cluster named `cluster_simple`, and several files with the following URIs on HDFS:

- `hdfs://hdfs1:9000/some_dir/some_file_1`
- `hdfs://hdfs1:9000/some_dir/some_file_2`
- `hdfs://hdfs1:9000/some_dir/some_file_3`
- `hdfs://hdfs1:9000/another_dir/some_file_1`
- `hdfs://hdfs1:9000/another_dir/some_file_2`
- `hdfs://hdfs1:9000/another_dir/some_file_3`

2. Query the number of rows in these files:
``` sql
SELECT count(*)
FROM hdfsCluster('cluster_simple', 'hdfs://hdfs1:9000/{some,another}_dir/some_file_{1..3}', 'TSV', 'name String, value UInt32')
```
3. Query the number of rows in all files of these two directories:
``` sql
SELECT count(*)
FROM hdfsCluster('cluster_simple', 'hdfs://hdfs1:9000/{some,another}_dir/*', 'TSV', 'name String, value UInt32')
```
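4. The `**` wildcard listed under `URI` can match files recursively. A minimal sketch, assuming the same cluster and a hypothetical set of nested subdirectories under `/some_dir`:

``` sql
-- Count rows in every TSV file under /some_dir, including files in nested subdirectories.
-- The URI is an assumption; adjust it to your own HDFS layout.
SELECT count(*)
FROM hdfsCluster('cluster_simple', 'hdfs://hdfs1:9000/some_dir/**', 'TSV', 'name String, value UInt32')
```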
:::note
If your listing of files contains number ranges with leading zeros, use the construction with braces for each digit separately or use `?`.
:::
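For example, files named with zero-padded numbers such as `some_file_01` through `some_file_09` (a hypothetical naming scheme) could be matched with per-digit braces or with `?`; a minimal sketch:

``` sql
-- Per-digit braces keep the leading zero: 0{1..9} expands to 01, 02, ..., 09.
-- Alternatively, 'some_file_??' would match any two characters in that position.
SELECT count(*)
FROM hdfsCluster('cluster_simple', 'hdfs://hdfs1:9000/some_dir/some_file_0{1..9}', 'TSV', 'name String, value UInt32')
```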
**See Also**

- [HDFS engine](../../engines/table-engines/integrations/hdfs.md)
- [HDFS table function](../../sql-reference/table-functions/hdfs.md)