jjiey commented on a change in pull request #14896:
URL: https://github.com/apache/flink/pull/14896#discussion_r579996787



##########
File path: docs/content.zh/docs/connectors/table/formats/avro-confluent.md
##########
@@ -115,41 +115,40 @@ CREATE TABLE user_created (
   'topic' = 'user_events_example2',
   'properties.bootstrap.servers' = 'localhost:9092',
 
-  -- Watch out: schema evolution in the context of a Kafka key is almost never backward nor
-  -- forward compatible due to hash partitioning.
+  -- Note: due to hash partitioning, schema evolution in the context of a Kafka key is almost never backward or forward compatible.
   'key.format' = 'avro-confluent',
   'key.avro-confluent.schema-registry.url' = 'http://localhost:8082',
   'key.fields' = 'kafka_key_id',
 
-  -- In this example, we want the Avro types of both the Kafka key and value to contain the field 'id'
-  -- => adding a prefix to the table column associated to the Kafka key field avoids clashes
+  -- In this example, we want the Avro types of both the Kafka key and value to contain the field 'id'
+  -- => adding a prefix to the table column associated with the Kafka key field avoids clashes
   'key.fields-prefix' = 'kafka_key_',
 
   'value.format' = 'avro-confluent',
   'value.avro-confluent.schema-registry.url' = 'http://localhost:8082',
   'value.fields-include' = 'EXCEPT_KEY',
    
-  -- subjects have a default value since Flink 1.13, though can be overriden:
+  -- subjects have a default value since Flink 1.13, but can be overridden:
   'key.avro-confluent.schema-registry.subject' = 'user_events_example2-key2',
  'value.avro-confluent.schema-registry.subject' = 'user_events_example2-value2'
 )
 ```
 
 ---
-Example of a table using the upsert connector with the Kafka value registered as an Avro record in the Schema Registry:
+Example of a table using the upsert-kafka connector with the Kafka value registered as an Avro record in the Schema Registry:
 
 ```sql
 CREATE TABLE user_created (
   
-  -- one column mapped to the Kafka raw UTF-8 key
+  -- this column is mapped to the raw UTF-8 Kafka key
   kafka_key_id STRING,
   
-  -- a few columns mapped to the Avro fields of the Kafka value
+  -- a few columns mapped to the Avro fields of the Kafka value
   id STRING, 
   name STRING, 
   email STRING, 
   
-  -- upsert-kafka connector requires a primary key to define the upsert behavior
+  -- the upsert-kafka connector requires a primary key to define the upsert behavior

Review comment:
       I guess you don't mean to fix this line, but rather line 132 of the English doc: 'Example of a table using the upsert connector with the Kafka value registered as an Avro record in the Schema Registry:'. So I should change 'upsert connector' to 'upsert-kafka connector', right?
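       For reference, a rough sketch of how that full upsert-kafka example might read, using the column names from the snippet above. The topic name, the raw key format, and the exact PRIMARY KEY clause are my guesses and are not quoted from the doc:

```sql
CREATE TABLE user_created (

  -- this column is mapped to the raw UTF-8 Kafka key
  kafka_key_id STRING,

  -- a few columns mapped to the Avro fields of the Kafka value
  id STRING,
  name STRING,
  email STRING,

  -- the upsert-kafka connector requires a primary key to define the upsert behavior
  PRIMARY KEY (kafka_key_id) NOT ENFORCED

) WITH (

  'connector' = 'upsert-kafka',
  'topic' = 'user_events_example3',
  'properties.bootstrap.servers' = 'localhost:9092',

  -- the key is read as a raw UTF-8 string; the key columns are derived from the primary key
  'key.format' = 'raw',
  'key.fields-prefix' = 'kafka_key_',

  -- the Kafka value is registered as an Avro record in the Schema Registry
  'value.format' = 'avro-confluent',
  'value.avro-confluent.schema-registry.url' = 'http://localhost:8082',
  'value.fields-include' = 'EXCEPT_KEY'
)
```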




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

