This is an automated email from the ASF dual-hosted git repository.
rmetzger pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git
The following commit(s) were added to refs/heads/master by this push:
new ce171b40a44 [hotfix] [docs] Typo fix in several docs (#26432)
ce171b40a44 is described below
commit ce171b40a442c3c1236a98cfa5a3ff3f7aa802e6
Author: Gunnar Morling <[email protected]>
AuthorDate: Thu Apr 10 09:21:22 2025 +0200
[hotfix] [docs] Typo fix in several docs (#26432)
---
docs/content.zh/docs/dev/table/concepts/overview.md | 4 ++--
docs/content/docs/dev/table/concepts/overview.md | 4 ++--
flink-python/docs/reference/pyflink.datastream/connectors.rst | 2 +-
3 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/docs/content.zh/docs/dev/table/concepts/overview.md b/docs/content.zh/docs/dev/table/concepts/overview.md
index 6203dc4bc62..a0c3ae0b221 100644
--- a/docs/content.zh/docs/dev/table/concepts/overview.md
+++ b/docs/content.zh/docs/dev/table/concepts/overview.md
@@ -96,7 +96,7 @@ GROUP BY word;
下面的例子展示了使用 `SELECT ... FROM` 语句查询 [upsert kafka 源表]({{< ref "docs/connectors/table/upsert-kafka" >}})。
```sql
-CREATE TABLE upsert_kakfa (
+CREATE TABLE upsert_kafka (
id INT PRIMARY KEY NOT ENFORCED,
message STRING
) WITH (
@@ -104,7 +104,7 @@ CREATE TABLE upsert_kakfa (
...
);
-SELECT * FROM upsert_kakfa;
+SELECT * FROM upsert_kafka;
```
源表的消息类型只包含 *INSERT*,*UPDATE_AFTER* 和 *DELETE*,然而下游要求完整的 changelog(包含 *UPDATE_BEFORE*)。
所以虽然查询本身没有包含状态计算,但是优化器依然隐式地推导出了一个 ChangelogNormalize 状态算子来生成完整的 changelog。
diff --git a/docs/content/docs/dev/table/concepts/overview.md b/docs/content/docs/dev/table/concepts/overview.md
index dc1596f1e7d..3f3649606e5 100644
--- a/docs/content/docs/dev/table/concepts/overview.md
+++ b/docs/content/docs/dev/table/concepts/overview.md
@@ -103,7 +103,7 @@ or through user configuration (see [`table-exec-source-cdc-events-duplicate`]({{
The following figure illustrates a `SELECT ... FROM` statement that querying an [upsert kafka source]({{< ref "docs/connectors/table/upsert-kafka" >}}).
```sql
-CREATE TABLE upsert_kakfa (
+CREATE TABLE upsert_kafka (
id INT PRIMARY KEY NOT ENFORCED,
message STRING
) WITH (
@@ -111,7 +111,7 @@ CREATE TABLE upsert_kakfa (
...
);
-SELECT * FROM upsert_kakfa;
+SELECT * FROM upsert_kafka;
```
The table source only provides messages with *INSERT*, *UPDATE_AFTER* and *DELETE* type, while the downstream sink requires a complete changelog (including *UPDATE_BEFORE*).
As a result, although this query itself does not involve explicit stateful calculation, the planner still generates a stateful operator called "ChangelogNormalize" to help obtain the complete changelog.
diff --git a/flink-python/docs/reference/pyflink.datastream/connectors.rst b/flink-python/docs/reference/pyflink.datastream/connectors.rst
index 6c0c7bc2e07..6b12aca802a 100644
--- a/flink-python/docs/reference/pyflink.datastream/connectors.rst
+++ b/flink-python/docs/reference/pyflink.datastream/connectors.rst
@@ -73,7 +73,7 @@ Number Sequence
Kafka
=====
-Kakfa Producer and Consumer
+Kafka Producer and Consumer
---------------------------
.. currentmodule:: pyflink.datastream.connectors.kafka