This is an automated email from the ASF dual-hosted git repository.

diwu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new e6513b54f7e [Fix](ecosystem) fix typo for streaming job tvf (#3481)
e6513b54f7e is described below

commit e6513b54f7e0db9cedee87374568addbab3b8947
Author: wudi <[email protected]>
AuthorDate: Mon Mar 23 11:29:50 2026 +0800

    [Fix](ecosystem) fix typo for streaming job tvf (#3481)
    
    ## Versions
    
    - [x] dev
    - [x] 4.x
    - [ ] 3.x
    - [ ] 2.1
    
    ## Languages
    
    - [ ] Chinese
    - [x] English
---
 .../import/streaming-job/streaming-job-tvf.md       | 21 ++++++++++-----------
 .../import/streaming-job/streaming-job-tvf.md       | 21 ++++++++++-----------
 2 files changed, 20 insertions(+), 22 deletions(-)

diff --git a/docs/data-operate/import/streaming-job/streaming-job-tvf.md b/docs/data-operate/import/streaming-job/streaming-job-tvf.md
index b06b72a9386..f747c306dca 100644
--- a/docs/data-operate/import/streaming-job/streaming-job-tvf.md
+++ b/docs/data-operate/import/streaming-job/streaming-job-tvf.md
@@ -122,7 +122,7 @@ DROP JOB where jobName = <job_name> ;
 
 ### Import command
 
-创建一个 Job + TVF 常驻导入作业语法如下:
+The syntax for creating a Job + TVF continuous import job is as follows:
 
 ```SQL
 CREATE JOB <job_name>
@@ -134,23 +134,22 @@ DO <Insert_Command>
 
 The module description is as follows:
 
-| Module | Description |
-
+| Module         | Description                                                  |
 | -------------- | ------------------------------------------------------------ |
-| job_name | Task name |
-| job_properties | General import parameters used to specify the Job |
-| comment | Remarks used to describe the Job |
+| job_name       | Task name                                                    |
+| job_properties | General import parameters used to specify the Job            |
+| comment        | Remarks used to describe the Job                             |
 | Insert_Command | SQL to execute; currently only Insert into table select * from s3() is supported |
 
 ### Importing Parameters
 
 #### FE Configuration Parameters
 
-| Parameter | Default Value | |
-| ------------------------------------ | ------ | ------------------------------------------- |
-| max_streaming_job_num | 1024 | Maximum number of Streaming jobs |
-| job_streaming_task_exec_thread_num | 10 | Number of threads used to execute StreamingTasks |
-| max_streaming_task_show_count | 100 | Maximum number of task execution records kept in memory for a StreamingTask |
+| Parameter                          | Default Value | Description                                                  |
+| ---------------------------------- | ------------- | ------------------------------------------------------------ |
+| max_streaming_job_num              | 1024          | Maximum number of Streaming jobs                             |
+| job_streaming_task_exec_thread_num | 10            | Number of threads used to execute StreamingTasks             |
+| max_streaming_task_show_count      | 100           | Maximum number of task execution records kept in memory for a StreamingTask |
 
 #### Import Configuration Parameters
 
diff --git a/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md b/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
index b06b72a9386..f747c306dca 100644
--- a/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
+++ b/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
@@ -122,7 +122,7 @@ DROP JOB where jobName = <job_name> ;
 
 ### Import command
 
-创建一个 Job + TVF 常驻导入作业语法如下:
+The syntax for creating a Job + TVF continuous import job is as follows:
 
 ```SQL
 CREATE JOB <job_name>
@@ -134,23 +134,22 @@ DO <Insert_Command>
 
 The module description is as follows:
 
-| Module | Description |
-
+| Module         | Description                                                  |
 | -------------- | ------------------------------------------------------------ |
-| job_name | Task name |
-| job_properties | General import parameters used to specify the Job |
-| comment | Remarks used to describe the Job |
+| job_name       | Task name                                                    |
+| job_properties | General import parameters used to specify the Job            |
+| comment        | Remarks used to describe the Job                             |
 | Insert_Command | SQL to execute; currently only Insert into table select * from s3() is supported |
 
 ### Importing Parameters
 
 #### FE Configuration Parameters
 
-| Parameter | Default Value | |
-| ------------------------------------ | ------ | ------------------------------------------- |
-| max_streaming_job_num | 1024 | Maximum number of Streaming jobs |
-| job_streaming_task_exec_thread_num | 10 | Number of threads used to execute StreamingTasks |
-| max_streaming_task_show_count | 100 | Maximum number of task execution records kept in memory for a StreamingTask |
+| Parameter                          | Default Value | Description                                                  |
+| ---------------------------------- | ------------- | ------------------------------------------------------------ |
+| max_streaming_job_num              | 1024          | Maximum number of Streaming jobs                             |
+| job_streaming_task_exec_thread_num | 10            | Number of threads used to execute StreamingTasks             |
+| max_streaming_task_show_count      | 100           | Maximum number of task execution records kept in memory for a StreamingTask |
 
 #### Import Configuration Parameters
 

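For context on the patched tables: the `Insert_Command` row states that only `INSERT INTO <table> SELECT * FROM S3()` is currently supported. A hypothetical filled-in statement might look like the sketch below; the table name, bucket, and all S3 properties are placeholders assumed for illustration, not taken from this commit, and the exact property names should be checked against the Doris S3 TVF documentation.

```SQL
-- Hypothetical Insert_Command sketch; table name and S3 properties are placeholders
INSERT INTO example_table
SELECT * FROM S3(
    "uri" = "s3://example-bucket/data/*.csv",
    "format" = "csv",
    "s3.region" = "us-east-1",
    "s3.access_key" = "<access_key>",
    "s3.secret_key" = "<secret_key>"
);
```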

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
