This is an automated email from the ASF dual-hosted git repository.

kxiao pushed a commit to branch branch-2.0
in repository https://gitbox.apache.org/repos/asf/doris.git

commit 54520b12e19637494848e8d190e64af414a8c24e
Author: wudi <[email protected]>
AuthorDate: Fri Jul 21 16:03:10 2023 +0800

    [doc](routineload)add routine load ssl example for access ali-kafka (#21877)
---
 .../import/import-way/routine-load-manual.md       | 36 +++++++++++++++--
 .../import/import-way/routine-load-manual.md       | 46 +++++++++++++++++-----
 2 files changed, 70 insertions(+), 12 deletions(-)

diff --git a/docs/en/docs/data-operate/import/import-way/routine-load-manual.md b/docs/en/docs/data-operate/import/import-way/routine-load-manual.md
index 85ad44a74e..d7dba381f2 100644
--- a/docs/en/docs/data-operate/import/import-way/routine-load-manual.md
+++ b/docs/en/docs/data-operate/import/import-way/routine-load-manual.md
@@ -165,7 +165,7 @@ eg: user_address data format
 
 ```
     
user_address|{"user_id":128787321878,"address":"朝阳区朝阳大厦XXX号","timestamp":1589191587}
- ```
+```
 eg: user_info data format
 ```
     
user_info|{"user_id":128787321878,"name":"张三","age":18,"timestamp":1589191587}
@@ -261,7 +261,7 @@ eg: user_info data format
              "address":"Los Angeles, CA, USA",
              "timestamp":1589191587
          }
-   ```
+```
 
 Create the Doris data table to be imported
 
@@ -410,6 +410,36 @@ FROM KAFKA
 >
 >
 
+**Access an Alibaba Cloud Message Queue for Kafka Cluster (Access Point Type is SSL)**
+
+```sql
+# Upload the certificate file first; the CA certificate can be downloaded from: https://github.com/AliwareMQ/aliware-kafka-demos/blob/master/kafka-cpp-demo/vpc-ssl/only-4096-ca-cert
+CREATE FILE "ca.pem" PROPERTIES("url" = "http://xxx/only-4096-ca-cert", "catalog" = "kafka");
+
+# create routine load job
+CREATE ROUTINE LOAD test.test_job on test_tbl
+PROPERTIES
+(
+    "desired_concurrent_number"="1",
+    "format" = "json"
+)
+FROM KAFKA
+(
+    "kafka_broker_list"= "xxx.alikafka.aliyuncs.com:9093",
+    "kafka_topic" = "test",
+    "property.group.id" = "test_group",
+    "property.client.id" = "test_group",
+    "property.security.protocol"="sasl_ssl",
+    "property.ssl.ca.location"="FILE:ca.pem",
+    "property.sasl.mechanism"="PLAIN",
+    "property.sasl.username"="xxx",
+    "property.sasl.password"="xxx"
+);
+```
+
+
+
 **Access the PLAIN certified Kafka cluster**
 
 To access a Kafka cluster with PLAIN authentication enabled, you need to add the following configuration:
@@ -435,7 +465,7 @@ To access a Kafka cluster with PLAIN authentication enabled, you need to add the
         "property.sasl.username"="admin",
         "property.sasl.password"="admin"
     );
-
+    
     ```
 
 <version since="1.2">
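The `ca.pem` uploaded via `CREATE FILE` above is an ordinary PEM-encoded CA certificate, and it is worth sanity-checking locally before wiring it into a routine load job. A minimal sketch with openssl (a throwaway self-signed CA stands in for the downloaded Alibaba Cloud certificate so that every command here is runnable; the file names are illustrative):

```shell
# Create a throwaway self-signed CA so the commands below run end to end;
# in practice you would use the downloaded only-4096-ca-cert file instead.
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
    -days 1 -subj "/CN=demo-ca"

# Inspect the certificate that property.ssl.ca.location will point at
openssl x509 -in ca.pem -noout -subject -enddate

# A well-formed self-signed CA certificate verifies against itself
openssl verify -CAfile ca.pem ca.pem
```

If the last command does not print `ca.pem: OK`, the file is not a usable CA certificate and the routine load job will fail its TLS handshake against the broker.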
diff --git a/docs/zh-CN/docs/data-operate/import/import-way/routine-load-manual.md b/docs/zh-CN/docs/data-operate/import/import-way/routine-load-manual.md
index db6c30ea12..754430b9b3 100644
--- a/docs/zh-CN/docs/data-operate/import/import-way/routine-load-manual.md
+++ b/docs/zh-CN/docs/data-operate/import/import-way/routine-load-manual.md
@@ -153,7 +153,7 @@ eg: user_address 表的 json 数据
     
 ```
     
user_address|{"user_id":128787321878,"address":"朝阳区朝阳大厦XXX号","timestamp":1589191587}
- ```
+```
 eg: user_info 表的 json 数据
 ```
     
user_info|{"user_id":128787321878,"name":"张三","age":18,"timestamp":1589191587}
@@ -248,10 +248,10 @@ eg: user_info 表的 json 数据
              "address":"朝阳区朝阳大厦XXX号",
              "timestamp":1589191587
          }
-   ```
-   
+```
+
    创建待导入的Doris数据表
-   
+
    ```sql
    CREATE TABLE `example_tbl` (
       `category` varchar(24) NULL COMMENT "",
@@ -274,9 +274,9 @@ eg: user_info 表的 json 数据
        "replication_num" = "1"
    );
    ```
-   
+
    以简单模式导入json数据
-   
+
    ```sql
    CREATE ROUTINE LOAD example_db.test_json_label_1 ON table1
    COLUMNS(category,price,author)
@@ -297,9 +297,9 @@ eg: user_info 表的 json 数据
        "kafka_offsets" = "0,0,0"
     );
    ```
-   
+
    精准导入json格式数据
-   
+
    ```sql
    CREATE ROUTINE LOAD example_db.test1 ON example_tbl
    COLUMNS(category, author, price, timestamp, dt=from_unixtime(timestamp, '%Y%m%d'))
@@ -389,6 +389,34 @@ eg: user_info 表的 json 数据
 >
 > [https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md)
 
+**Access an Alibaba Cloud Message Queue for Kafka Cluster (Access Point Type is SSL)**
+
+```sql
+# Upload the certificate file first; the CA certificate can be downloaded from: https://github.com/AliwareMQ/aliware-kafka-demos/blob/master/kafka-cpp-demo/vpc-ssl/only-4096-ca-cert
+CREATE FILE "ca.pem" PROPERTIES("url" = "http://xxx/only-4096-ca-cert", "catalog" = "kafka");
+
+# create the routine load job
+CREATE ROUTINE LOAD test.test_job on test_tbl
+PROPERTIES
+(
+    "desired_concurrent_number"="1",
+    "format" = "json"
+)
+FROM KAFKA
+(
+    "kafka_broker_list"= "xxx.alikafka.aliyuncs.com:9093",
+    "kafka_topic" = "test",
+    "property.group.id" = "test_group",
+    "property.client.id" = "test_group",
+    "property.security.protocol"="sasl_ssl",
+    "property.ssl.ca.location"="FILE:ca.pem",
+    "property.sasl.mechanism"="PLAIN",
+    "property.sasl.username"="xxx",
+    "property.sasl.password"="xxx"
+);
+```
+
 **访问 PLAIN 认证的 Kafka 集群**
 
 访问开启 PLAIN 认证的Kafka集群,需要增加以下配置:
@@ -414,7 +442,7 @@ eg: user_info 表的 json 数据
         "property.sasl.username"="admin",
         "property.sasl.password"="admin"
     );
-
+    
     ```
 
 **访问 Kerberos 认证的 Kafka 集群**

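In both new examples, the `property.*` entries are not interpreted by Doris itself: the routine load documentation describes them as pass-through Kafka client settings, with the `property.` prefix stripped before they reach librdkafka (hence the CONFIGURATION.md link in the surrounding doc). A minimal sketch of that mapping, reusing the keys from the example job (the file name here is illustrative):

```shell
# Mirror the property.* keys from the routine load example; Doris strips
# the "property." prefix and forwards the remainder to the Kafka client.
cat > job_props.txt <<'EOF'
property.security.protocol=sasl_ssl
property.sasl.mechanism=PLAIN
property.ssl.ca.location=FILE:ca.pem
EOF

# The configuration keys the Kafka client actually sees:
sed 's/^property\.//' job_props.txt
```

The stripped names (`security.protocol`, `sasl.mechanism`, `ssl.ca.location`) are exactly the keys documented in librdkafka's CONFIGURATION.md, which is the reference to consult when a job needs client options beyond the ones shown in these examples.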

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
