This is an automated email from the ASF dual-hosted git repository.
liaoxin pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git
The following commit(s) were added to refs/heads/master by this push:
new a665b9d5b6 Update load-manual.md (#1339)
a665b9d5b6 is described below
commit a665b9d5b63eb470ba6117c237d13c2747b6cdcc
Author: Ruffian Jiang <[email protected]>
AuthorDate: Tue Nov 19 18:45:34 2024 +0800
Update load-manual.md (#1339)
---
docs/data-operate/import/load-manual.md | 6 +++---
.../current/data-operate/import/load-manual.md | 4 ++--
.../version-2.1/data-operate/import/load-manual.md | 4 ++--
.../version-3.0/data-operate/import/load-manual.md | 4 ++--
versioned_docs/version-2.1/data-operate/import/load-manual.md | 6 +++---
versioned_docs/version-3.0/data-operate/import/load-manual.md | 6 +++---
6 files changed, 15 insertions(+), 15 deletions(-)
diff --git a/docs/data-operate/import/load-manual.md b/docs/data-operate/import/load-manual.md
index 6dcf57e666..c7c34cdda8 100644
--- a/docs/data-operate/import/load-manual.md
+++ b/docs/data-operate/import/load-manual.md
@@ -32,7 +32,7 @@ Apache Doris offers various methods for importing and integrating data, allowing
   - For higher concurrency or frequency (more than 20 concurrent writes or multiple writes per minute), you can enable [Group Commit](./import-way/group-commit-manual.md) and use JDBC INSERT or Stream Load.
-  - For high throughput, you can use [Stream Load](./import-way/stream-load-manua) via HTTP.
+  - For high throughput, you can use [Stream Load](./import-way/stream-load-manual) via HTTP.
 - **Streaming Synchronization**: Real-time data streams (e.g., Flink, Kafka, transactional databases) are imported into Doris tables, ideal for real-time analysis and querying.
@@ -48,7 +48,7 @@ Apache Doris offers various methods for importing and integrating data, allowing
   - You can use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously load files from S3, HDFS, and NAS into Doris, and you can perform the operation asynchronously using a [JOB](../scheduler/job-scheduler.md).
-  - You can use [Stream Load](./import-way/stream-load-manua) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
+  - You can use [Stream Load](./import-way/stream-load-manual) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
 - **External Data Source Integration**: Query and partially import data from external sources (e.g., Hive, JDBC, Iceberg) into Doris tables.
@@ -70,4 +70,4 @@ Doris's import process mainly involves various aspects such as data sources, dat
 | [INSERT INTO SELECT](./import-way/insert-into-manual.md) | Importing from an external source like a table in a catalog or files in S3. | SQL | Depending on memory size | Synchronous, Asynchronous via Job |
 | [Routine Load](./import-way/routine-load-manual.md) | Real-time import from Kafka | csv, json | Micro-batch import MB to GB | Asynchronous |
 | [MySQL Load](./import-way/mysql-load-manual.md) | Importing from local files. | csv | Less than 1GB | Synchronous |
-| [Group Commit](./import-way/group-commit-manual.md) | Writing with high frequency. | Depending on the import method used | Micro-batch import KB | - |
\ No newline at end of file
+| [Group Commit](./import-way/group-commit-manual.md) | Writing with high frequency. | Depending on the import method used | Micro-batch import KB | - |
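For context on the link this commit corrects: "Stream Load via HTTP" refers to Doris's documented HTTP PUT endpoint `/api/{db}/{table}/_stream_load` on the FE. A minimal sketch of how such a request is shaped (the host, database, table, and credentials below are hypothetical; the request is built but not sent):

```python
import base64
import urllib.request

def build_stream_load_request(host, db, table, data, user="root", password=""):
    """Build (but do not send) a Doris Stream Load HTTP PUT request.

    Host/port and credentials here are placeholders; 8030 is the default
    FE HTTP port, and the FE normally redirects the request to a BE.
    """
    url = f"http://{host}:8030/api/{db}/{table}/_stream_load"
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {
        "Authorization": f"Basic {auth}",
        "Expect": "100-continue",      # Stream Load expects this header
        "format": "csv",               # payload format
        "column_separator": ",",       # how CSV columns are split
    }
    return urllib.request.Request(url, data=data.encode(),
                                  headers=headers, method="PUT")

req = build_stream_load_request("doris-fe.example.com", "demo", "orders",
                                "1,apple\n2,pear\n")
print(req.get_method(), req.full_url)
```

In practice the same request is usually issued with `curl -XPUT`; the sketch only illustrates the endpoint shape and the headers Stream Load reads.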
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/load-manual.md
index 7e056884b3..9cd401158e 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/load-manual.md
@@ -32,7 +32,7 @@ Apache Doris offers multiple methods for importing and integrating data; you can use the appropriate
   - When concurrency or write frequency is high (more than 20 concurrent writes, or multiple writes per minute), enabling [Group Commit](./import-way/group-commit-manual.md) and writing data with JDBC INSERT or Stream Load is recommended.
-  - For high throughput, writing data over HTTP with [Stream Load](./import-way/stream-load-manua) is recommended.
+  - For high throughput, writing data over HTTP with [Stream Load](./import-way/stream-load-manual) is recommended.
 - **Streaming Synchronization**: import real-time data streams (e.g., Flink, Kafka, transactional databases) into Doris tables in real time, suitable for scenarios that require real-time analysis and querying.
@@ -47,7 +47,7 @@ Apache Doris offers multiple methods for importing and integrating data; you can use the appropriate
   - You can use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously load files from S3, HDFS, and NAS into Doris; combined with a [JOB](../scheduler/job-scheduler.md), the load can run asynchronously.
-  - You can use [Stream Load](./import-way/stream-load-manua) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
+  - You can use [Stream Load](./import-way/stream-load-manual) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
 - **External Data Source Integration**: integrate with external data sources (e.g., Hive, JDBC, Iceberg) to query external data and import part of it into Doris tables.
   - You can create a [Catalog](../../lakehouse/lakehouse-overview.md) to read data from external sources and use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously write that data into Doris; combined with a [JOB](../scheduler/job-scheduler.md), the load can run asynchronously.
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/load-manual.md
index 7e056884b3..9cd401158e 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/load-manual.md
@@ -32,7 +32,7 @@ Apache Doris offers multiple methods for importing and integrating data; you can use the appropriate
   - When concurrency or write frequency is high (more than 20 concurrent writes, or multiple writes per minute), enabling [Group Commit](./import-way/group-commit-manual.md) and writing data with JDBC INSERT or Stream Load is recommended.
-  - For high throughput, writing data over HTTP with [Stream Load](./import-way/stream-load-manua) is recommended.
+  - For high throughput, writing data over HTTP with [Stream Load](./import-way/stream-load-manual) is recommended.
 - **Streaming Synchronization**: import real-time data streams (e.g., Flink, Kafka, transactional databases) into Doris tables in real time, suitable for scenarios that require real-time analysis and querying.
@@ -47,7 +47,7 @@ Apache Doris offers multiple methods for importing and integrating data; you can use the appropriate
   - You can use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously load files from S3, HDFS, and NAS into Doris; combined with a [JOB](../scheduler/job-scheduler.md), the load can run asynchronously.
-  - You can use [Stream Load](./import-way/stream-load-manua) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
+  - You can use [Stream Load](./import-way/stream-load-manual) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
 - **External Data Source Integration**: integrate with external data sources (e.g., Hive, JDBC, Iceberg) to query external data and import part of it into Doris tables.
   - You can create a [Catalog](../../lakehouse/lakehouse-overview.md) to read data from external sources and use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously write that data into Doris; combined with a [JOB](../scheduler/job-scheduler.md), the load can run asynchronously.
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/load-manual.md
index 7e056884b3..9cd401158e 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/load-manual.md
@@ -32,7 +32,7 @@ Apache Doris offers multiple methods for importing and integrating data; you can use the appropriate
   - When concurrency or write frequency is high (more than 20 concurrent writes, or multiple writes per minute), enabling [Group Commit](./import-way/group-commit-manual.md) and writing data with JDBC INSERT or Stream Load is recommended.
-  - For high throughput, writing data over HTTP with [Stream Load](./import-way/stream-load-manua) is recommended.
+  - For high throughput, writing data over HTTP with [Stream Load](./import-way/stream-load-manual) is recommended.
 - **Streaming Synchronization**: import real-time data streams (e.g., Flink, Kafka, transactional databases) into Doris tables in real time, suitable for scenarios that require real-time analysis and querying.
@@ -47,7 +47,7 @@ Apache Doris offers multiple methods for importing and integrating data; you can use the appropriate
   - You can use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously load files from S3, HDFS, and NAS into Doris; combined with a [JOB](../scheduler/job-scheduler.md), the load can run asynchronously.
-  - You can use [Stream Load](./import-way/stream-load-manua) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
+  - You can use [Stream Load](./import-way/stream-load-manual) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
 - **External Data Source Integration**: integrate with external data sources (e.g., Hive, JDBC, Iceberg) to query external data and import part of it into Doris tables.
   - You can create a [Catalog](../../lakehouse/lakehouse-overview.md) to read data from external sources and use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously write that data into Doris; combined with a [JOB](../scheduler/job-scheduler.md), the load can run asynchronously.
diff --git a/versioned_docs/version-2.1/data-operate/import/load-manual.md b/versioned_docs/version-2.1/data-operate/import/load-manual.md
index 6dcf57e666..c7c34cdda8 100644
--- a/versioned_docs/version-2.1/data-operate/import/load-manual.md
+++ b/versioned_docs/version-2.1/data-operate/import/load-manual.md
@@ -32,7 +32,7 @@ Apache Doris offers various methods for importing and integrating data, allowing
   - For higher concurrency or frequency (more than 20 concurrent writes or multiple writes per minute), you can enable [Group Commit](./import-way/group-commit-manual.md) and use JDBC INSERT or Stream Load.
-  - For high throughput, you can use [Stream Load](./import-way/stream-load-manua) via HTTP.
+  - For high throughput, you can use [Stream Load](./import-way/stream-load-manual) via HTTP.
 - **Streaming Synchronization**: Real-time data streams (e.g., Flink, Kafka, transactional databases) are imported into Doris tables, ideal for real-time analysis and querying.
@@ -48,7 +48,7 @@ Apache Doris offers various methods for importing and integrating data, allowing
   - You can use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously load files from S3, HDFS, and NAS into Doris, and you can perform the operation asynchronously using a [JOB](../scheduler/job-scheduler.md).
-  - You can use [Stream Load](./import-way/stream-load-manua) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
+  - You can use [Stream Load](./import-way/stream-load-manual) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
 - **External Data Source Integration**: Query and partially import data from external sources (e.g., Hive, JDBC, Iceberg) into Doris tables.
@@ -70,4 +70,4 @@ Doris's import process mainly involves various aspects such as data sources, dat
 | [INSERT INTO SELECT](./import-way/insert-into-manual.md) | Importing from an external source like a table in a catalog or files in S3. | SQL | Depending on memory size | Synchronous, Asynchronous via Job |
 | [Routine Load](./import-way/routine-load-manual.md) | Real-time import from Kafka | csv, json | Micro-batch import MB to GB | Asynchronous |
 | [MySQL Load](./import-way/mysql-load-manual.md) | Importing from local files. | csv | Less than 1GB | Synchronous |
-| [Group Commit](./import-way/group-commit-manual.md) | Writing with high frequency. | Depending on the import method used | Micro-batch import KB | - |
\ No newline at end of file
+| [Group Commit](./import-way/group-commit-manual.md) | Writing with high frequency. | Depending on the import method used | Micro-batch import KB | - |
diff --git a/versioned_docs/version-3.0/data-operate/import/load-manual.md b/versioned_docs/version-3.0/data-operate/import/load-manual.md
index 6dcf57e666..c7c34cdda8 100644
--- a/versioned_docs/version-3.0/data-operate/import/load-manual.md
+++ b/versioned_docs/version-3.0/data-operate/import/load-manual.md
@@ -32,7 +32,7 @@ Apache Doris offers various methods for importing and integrating data, allowing
   - For higher concurrency or frequency (more than 20 concurrent writes or multiple writes per minute), you can enable [Group Commit](./import-way/group-commit-manual.md) and use JDBC INSERT or Stream Load.
-  - For high throughput, you can use [Stream Load](./import-way/stream-load-manua) via HTTP.
+  - For high throughput, you can use [Stream Load](./import-way/stream-load-manual) via HTTP.
 - **Streaming Synchronization**: Real-time data streams (e.g., Flink, Kafka, transactional databases) are imported into Doris tables, ideal for real-time analysis and querying.
@@ -48,7 +48,7 @@ Apache Doris offers various methods for importing and integrating data, allowing
   - You can use [INSERT INTO SELECT](./import-way/insert-into-manual.md) to synchronously load files from S3, HDFS, and NAS into Doris, and you can perform the operation asynchronously using a [JOB](../scheduler/job-scheduler.md).
-  - You can use [Stream Load](./import-way/stream-load-manua) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
+  - You can use [Stream Load](./import-way/stream-load-manual) or [Doris Streamloader](../../ecosystem/doris-streamloader.md) to write local files into Doris.
 - **External Data Source Integration**: Query and partially import data from external sources (e.g., Hive, JDBC, Iceberg) into Doris tables.
@@ -70,4 +70,4 @@ Doris's import process mainly involves various aspects such as data sources, dat
 | [INSERT INTO SELECT](./import-way/insert-into-manual.md) | Importing from an external source like a table in a catalog or files in S3. | SQL | Depending on memory size | Synchronous, Asynchronous via Job |
 | [Routine Load](./import-way/routine-load-manual.md) | Real-time import from Kafka | csv, json | Micro-batch import MB to GB | Asynchronous |
 | [MySQL Load](./import-way/mysql-load-manual.md) | Importing from local files. | csv | Less than 1GB | Synchronous |
-| [Group Commit](./import-way/group-commit-manual.md) | Writing with high frequency. | Depending on the import method used | Micro-batch import KB | - |
\ No newline at end of file
+| [Group Commit](./import-way/group-commit-manual.md) | Writing with high frequency. | Depending on the import method used | Micro-batch import KB | - |
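The tables touched by this diff describe Group Commit as "micro-batch import KB". Purely as an illustration of that KB-scale batching idea (this is not a Doris API; group commit batching actually happens server-side), a client-side sketch of chopping CSV rows into KB-sized micro-batches before submitting them:

```python
def make_micro_batches(rows, max_bytes=4 * 1024):
    """Group CSV rows into micro-batches of at most max_bytes each.

    Hypothetical helper for illustration only: each batch could then be
    submitted as one small write (e.g. one Stream Load request).
    """
    batches, current, size = [], [], 0
    for row in rows:
        row_bytes = len(row.encode()) + 1  # +1 for the trailing newline
        if current and size + row_bytes > max_bytes:
            batches.append(current)        # flush before exceeding the cap
            current, size = [], 0
        current.append(row)
        size += row_bytes
    if current:
        batches.append(current)            # flush the final partial batch
    return batches

rows = [f"{i},value_{i}" for i in range(1000)]
batches = make_micro_batches(rows)
print(len(batches), "batches covering", sum(len(b) for b in batches), "rows")
```

With group commit enabled on the server, many such small writes are merged into one internal commit, which is why the table lists KB-scale batches rather than a data-size limit.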
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]