wuchong commented on a change in pull request #12731:
URL: https://github.com/apache/flink/pull/12731#discussion_r443305164



##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -1,5 +1,5 @@
 ---
-title: "Debezium Format"
+title: "Debezium 格式化"

Review comment:
      Keep "Format" untranslated; rendering it as "格式化" is also inaccurate.

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。

Review comment:
       ```suggestion
   [Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog Data 
Capture,变更数据捕获)的工具,可以把来自 MySQL、PostgreSQL、Oracle、Microsoft SQL Server 
和许多其他数据库的更改实时流式传输到 Kafka 中。 Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 
序列化消息。
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志
+ - 关于数据库的实时物化视图
+ - 临时联接更改数据库表的历史记录等等。
 
-*Note: Support for interpreting Debezium Avro messages and emitting Debezium 
messages is on the roadmap.*
+*注意: 路线图上支持解释 Debezium Avro 消息和发出 Debezium 消息。*
 
-Dependencies
+依赖
 ------------
 
-In order to setup the Debezium format, the following table provides dependency 
information for both projects using a build automation tool (such as Maven or 
SBT) and SQL Client with SQL JAR bundles.
+为了设置 Debezium 格式,下表提供了使用构建自动化工具(例如 Maven 或 SBT)和带有 SQL JAR 包的 SQL Client 
的两个项目的依赖项信息。
 
 | Maven dependency   | SQL Client JAR         |

Review comment:
       Maven 依赖

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -77,16 +78,15 @@ Debezium provides a unified format for changelog, here is a 
simple example for a
 }
 ```
 
-*Note: please refer to [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium)
 about the meaning of each fields.*
+*注意: 请参考 [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium)
 文档,了解每个字段的含义。*

Review comment:
       ```suggestion
   *注意: 请参考 [Debezium 
文档](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium),了解每个字段的含义。*
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -128,27 +128,27 @@ In some cases, users may setup the Debezium Kafka Connect 
with the Kafka configu
 }
 ```
 
-In order to interpret such messages, you need to add the option 
`'debezium-json.schema-include' = 'true'` into above DDL WITH clause (`false` 
by default). Usually, this is not recommended to include schema because this 
makes the messages very verbose and reduces parsing performance.
+为了说明这一类信息,你需要在上述 DDL WITH 子句中添加选项'debezium-json.schema-include'='true'(默认为 
false)。通常情况下,建议不要包含结构的描述,因为这样会使消息变得非常冗长,并降低解析性能。
 
-After registering the topic as a Flink table, then you can consume the 
Debezium messages as a changelog source.
+在将主题注册为 Flink 表之后,可以将 Debezium 消息用作变更日志源。
 
 <div class="codetabs" markdown="1">
 <div data-lang="SQL" markdown="1">
 {% highlight sql %}
--- a real-time materialized view on the MySQL "products"
--- which calculate the latest average of weight for the same products
+-- MySQL “products” 的实时物化视图
+-- 计算相同产品的最新平均重量
 SELECT name, AVG(weight) FROM topic_products GROUP BY name;
 
--- synchronize all the data and incremental changes of MySQL "products" table 
to
--- Elasticsearch "products" index for future searching
+-- 将 MySQL “products” 表的所有数据和增量更改同步到
+-- Elasticsearch “products” 索引,供将来查找
 INSERT INTO elasticsearch_products
 SELECT * FROM topic_products;
 {% endhighlight %}
 </div>
 </div>
 
 
-Format Options
+格式选项

Review comment:
      The table contents should be translated as well.

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志
+ - 关于数据库的实时物化视图
+ - 临时联接更改数据库表的历史记录等等。
 
-*Note: Support for interpreting Debezium Avro messages and emitting Debezium 
messages is on the roadmap.*
+*注意: 路线图上支持解释 Debezium Avro 消息和发出 Debezium 消息。*

Review comment:
       ```suggestion
   *注意: 支持解析 Debezium Avro 消息和输出 Debezium 消息已经规划在路线图上了。*
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如

Review comment:
       ```suggestion
   Flink 支持将 Debezium JSON 消息解析为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志
+ - 关于数据库的实时物化视图
+ - 临时联接更改数据库表的历史记录等等。
 
-*Note: Support for interpreting Debezium Avro messages and emitting Debezium 
messages is on the roadmap.*
+*注意: 路线图上支持解释 Debezium Avro 消息和发出 Debezium 消息。*
 
-Dependencies
+依赖
 ------------
 
-In order to setup the Debezium format, the following table provides dependency 
information for both projects using a build automation tool (such as Maven or 
SBT) and SQL Client with SQL JAR bundles.
+为了设置 Debezium 格式,下表提供了使用构建自动化工具(例如 Maven 或 SBT)和带有 SQL JAR 包的 SQL Client 
的两个项目的依赖项信息。
 
 | Maven dependency   | SQL Client JAR         |
 | :----------------- | :----------------------|
 | `flink-json`       | Built-in               |
 
-*Note: please refer to [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/index.html) 
about how to setup a Debezium Kafka Connect to synchronize changelog to Kafka 
topics.*
+*注意: 请参考 [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/index.html) 
文档,了解如何设置 Debezium Kafka Connect 用来将变更日志同步到 Kafka 主题。*
 
 
-How to use Debezium format
+如何使用 Debezium 格式

Review comment:
       ```suggestion
   如何使用 Debezium Format
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志
+ - 关于数据库的实时物化视图
+ - 临时联接更改数据库表的历史记录等等。

Review comment:
       ```suggestion
    - 关联维度数据库的变更历史,等等。
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -77,16 +78,15 @@ Debezium provides a unified format for changelog, here is a 
simple example for a
 }
 ```
 
-*Note: please refer to [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium)
 about the meaning of each fields.*
+*注意: 请参考 [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium)
 文档,了解每个字段的含义。*
 
-The MySQL `products` table has 4 columns (`id`, `name`, `description` and 
`weight`). The above JSON message is an update change event on the `products` 
table where the `weight` value of the row with `id = 111` is changed from 
`5.18` to `5.15`.
-Assuming this messages is synchronized to Kafka topic `products_binlog`, then 
we can use the following DDL to consume this topic and interpret the change 
events.
+MySQL 产品表有4列(id、name、description、weight)。上面的 JSON 消息是 products 表上的 update 
change 事件,其中 id = 111 的行的 weight 值从 5.18 更改为 5.15。假设此消息已同步到 Kafka 主题 
products_binlog,则可以使用以下 DDL 来使用此主题并解释更改事件。

Review comment:
       ```suggestion
   MySQL 产品表有4列(`id`、`name`、`description`、`weight`)。上面的 JSON 消息是 `products` 
表上的一条更新事件,其中 `id = 111` 的行的 `weight` 值从 `5.18` 更改为 `5.15`。假设此消息已同步到 Kafka 主题 
`products_binlog`,则可以使用以下 DDL 来使用此主题并解析更新事件。
   ```
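
For reference alongside this comment, the complete DDL that the doc excerpt is pointing at would look roughly like the following. This is a hedged sketch: the broker address and consumer group are placeholder assumptions, and only the `'format' = 'debezium-json'` option is the point being illustrated.

```sql
-- Hypothetical sketch of the DDL under discussion; broker address and
-- group id are placeholders, not values from the PR.
CREATE TABLE topic_products (
  -- schema matches the MySQL "products" table
  id BIGINT,
  name STRING,
  description STRING,
  weight DECIMAL(10, 2)
) WITH (
  'connector' = 'kafka',
  'topic' = 'products_binlog',
  'properties.bootstrap.servers' = 'localhost:9092',
  'properties.group.id' = 'testGroup',
  -- interpret the Debezium JSON messages as a changelog source
  'format' = 'debezium-json'
);
```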

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志
+ - 关于数据库的实时物化视图

Review comment:
       ```suggestion
    - 数据库的实时物化视图
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志

Review comment:
       ```suggestion
    - 日志审计
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -102,7 +102,7 @@ CREATE TABLE topic_products (
 </div>
 </div>
 
-In some cases, users may setup the Debezium Kafka Connect with the Kafka 
configuration `'value.converter.schemas.enable'` enabled to include schema in 
the message. Then the Debezium JSON message may look like this:
+在某些情况下,用户可以使用 Kafka 的配置 “value.converter.schemas.enable” 设置 Debezium Kafka 
Connect,用来在消息中包括结构的描述信息。然后,Debezium JSON 消息可能如下所示:

Review comment:
       ```suggestion
   在某些情况下,用户在设置 Debezium Kafka Connect 时,可能会开启 Kafka 的配置 
`'value.converter.schemas.enable'`,用来在消息体中包含 schema 信息。然后,Debezium JSON 
消息可能如下所示:
   ```
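
For context, the Kafka Connect setting this comment refers to is toggled in the converter configuration. A minimal sketch, assuming Kafka Connect's JSON converter; the rest of the Debezium connector configuration is omitted:

```properties
# Sketch of the relevant converter settings only; the surrounding
# Debezium connector config is assumed, not taken from the PR.
value.converter=org.apache.kafka.connect.json.JsonConverter
# When enabled, each message embeds its schema, producing the verbose
# envelope shown in the doc excerpt above.
value.converter.schemas.enable=true
```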

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -29,32 +29,33 @@ under the License.
 * This will be replaced by the TOC
 {:toc}
 
-[Debezium](https://debezium.io/) is a CDC (Changelog Data Capture) tool that 
can stream changes in real-time from MySQL, PostgreSQL, Oracle, Microsoft SQL 
Server and many other databases into Kafka. Debezium provides a unified format 
schema for changelog and supports to serialize messages using JSON and [Apache 
Avro](https://avro.apache.org/).
+[Debezium](https://debezium.io/) Debezium 是一个 CDC(Changelog数据捕获)的工具,可以把来自 
MySQL、PostgreSQL、Oracle、Microsoft SQL Server 和许多其他数据库的更改实时流式传输到 Kafka 中。 
Debezium 为变更日志提供了统一的格式结构,并支持使用 JSON 和 Apache Avro 序列化消息。
 
-Flink supports to interpret Debezium JSON messages as INSERT/UPDATE/DELETE 
messages into Flink SQL system. This is useful in many cases to leverage this 
feature, such as
- - synchronizing incremental data from databases to other systems
- - auditing logs
- - real-time materialized views on databases
- - temporal join changing history of a database table and so on.
+Flink 支持将 Debezium JSON 消息解释为 INSERT / UPDATE / DELETE 消息到 Flink SQL 
系统中。在很多情况下,利用这个特性非常的有用,例如
+ - 将增量数据从数据库同步到其他系统
+ - 审核日志
+ - 关于数据库的实时物化视图
+ - 临时联接更改数据库表的历史记录等等。
 
-*Note: Support for interpreting Debezium Avro messages and emitting Debezium 
messages is on the roadmap.*
+*注意: 路线图上支持解释 Debezium Avro 消息和发出 Debezium 消息。*
 
-Dependencies
+依赖
 ------------
 
-In order to setup the Debezium format, the following table provides dependency 
information for both projects using a build automation tool (such as Maven or 
SBT) and SQL Client with SQL JAR bundles.
+为了设置 Debezium 格式,下表提供了使用构建自动化工具(例如 Maven 或 SBT)和带有 SQL JAR 包的 SQL Client 
的两个项目的依赖项信息。
 
 | Maven dependency   | SQL Client JAR         |
 | :----------------- | :----------------------|
 | `flink-json`       | Built-in               |
 
-*Note: please refer to [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/index.html) 
about how to setup a Debezium Kafka Connect to synchronize changelog to Kafka 
topics.*
+*注意: 请参考 [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/index.html) 
文档,了解如何设置 Debezium Kafka Connect 用来将变更日志同步到 Kafka 主题。*

Review comment:
       ```suggestion
   *注意: 请参考 [Debezium 
文档](https://debezium.io/documentation/reference/1.1/index.html),了解如何设置 Debezium 
Kafka Connect 用来将变更日志同步到 Kafka 主题。*
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -77,16 +78,15 @@ Debezium provides a unified format for changelog, here is a 
simple example for a
 }
 ```
 
-*Note: please refer to [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium)
 about the meaning of each fields.*
+*注意: 请参考 [Debezium 
documentation](https://debezium.io/documentation/reference/1.1/connectors/mysql.html#mysql-connector-events_debezium)
 文档,了解每个字段的含义。*
 
-The MySQL `products` table has 4 columns (`id`, `name`, `description` and 
`weight`). The above JSON message is an update change event on the `products` 
table where the `weight` value of the row with `id = 111` is changed from 
`5.18` to `5.15`.
-Assuming this messages is synchronized to Kafka topic `products_binlog`, then 
we can use the following DDL to consume this topic and interpret the change 
events.
+MySQL 产品表有4列(id、name、description、weight)。上面的 JSON 消息是 products 表上的 update 
change 事件,其中 id = 111 的行的 weight 值从 5.18 更改为 5.15。假设此消息已同步到 Kafka 主题 
products_binlog,则可以使用以下 DDL 来使用此主题并解释更改事件。
 
 <div class="codetabs" markdown="1">
 <div data-lang="SQL" markdown="1">
 {% highlight sql %}
 CREATE TABLE topic_products (
-  -- schema is totally the same to the MySQL "products" table
+  -- 结构与 MySQL 的 products 表完全相同

Review comment:
       ```suggestion
     -- schema 与 MySQL 的 products 表完全相同
   ```

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -128,27 +128,27 @@ In some cases, users may setup the Debezium Kafka Connect 
with the Kafka configu
 }
 ```
 
-In order to interpret such messages, you need to add the option 
`'debezium-json.schema-include' = 'true'` into above DDL WITH clause (`false` 
by default). Usually, this is not recommended to include schema because this 
makes the messages very verbose and reduces parsing performance.
+为了说明这一类信息,你需要在上述 DDL WITH 子句中添加选项'debezium-json.schema-include'='true'(默认为 
false)。通常情况下,建议不要包含结构的描述,因为这样会使消息变得非常冗长,并降低解析性能。

Review comment:
       ```suggestion
   为了解析这一类信息,你需要在上述 DDL WITH 子句中添加选项 
`'debezium-json.schema-include'='true'`(默认为 false)。通常情况下,建议不要包含 schema 
信息,因为这样会使消息变得非常冗长,并降低解析性能。
   ```
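
To make the option under discussion concrete, the DDL with `'debezium-json.schema-include'` enabled could be sketched as follows. This is an illustrative sketch, not text from the PR; the connector and topic values are placeholder assumptions.

```sql
-- Hypothetical WITH clause illustrating the schema-include option;
-- connector/topic values are placeholders.
CREATE TABLE topic_products (
  id BIGINT,
  name STRING,
  description STRING,
  weight DECIMAL(10, 2)
) WITH (
  'connector' = 'kafka',
  'topic' = 'products_binlog',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'debezium-json',
  -- needed when messages embed the schema; default is 'false'
  'debezium-json.schema-include' = 'true'
);
```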

##########
File path: docs/dev/table/connectors/formats/debezium.zh.md
##########
@@ -200,8 +200,8 @@ Format Options
     </tbody>
 </table>
 
-Data Type Mapping
+数据类型映射
 ----------------
 
-Currently, the Debezium format uses JSON format for deserialization. Please 
refer to [JSON format documentation]({% link 
dev/table/connectors/formats/json.zh.md %}#data-type-mapping) for more details 
about the data type mapping.
+目前,Debezium 格式使用 JSON 格式进行反序列化。有关数据类型映射的更多详细信息,请参考 JSON 格式文档。[JSON format 
documentation]({% link /zh/dev/table/connectors/formats/json.zh.md 
%}#data-type-mapping)。

Review comment:
       ```suggestion
   目前,Debezium Format 使用 JSON Format 进行反序列化。有关数据类型映射的更多详细信息,请参考 [JSON Format 
文档]({% link /zh/dev/table/connectors/formats/json.zh.md %}#data-type-mapping)。
   ```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org