morazow commented on code in PR #739:
URL: https://github.com/apache/flink-web/pull/739#discussion_r1588306116


##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 
3.1.0! This is the first release after the community accepted the donation of 
Flink CDC as a sub-project of Apache Flink, with exciting new features such as 
transform and table merging. The eco-system of Flink CDC keeps expanding, 
including new Kafka and Paimon pipeline sinks and enhancement to existing 
connectors.
+
+We'd like to invite you to check out [Flink CDC 
documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and 
have a try on [the quickstart 
tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction)
 to explore the world of Flink CDC. Also we encourage you to [download the 
release](https://flink.apache.org/downloads.html) and share your feedback with 
the community through the Flink [mailing 
lists](https://flink.apache.org/community.html#mailing-lists) or 
[JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new 
release and we’d be eager to learn about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability of making transformations in the CDC 
pipeline. By incorporating a `transform` section within the YAML pipeline 
definitions, users can now easily apply a variety of transformations to data 
change event from source, including projections, calculations, and addition of 
constant columns, enhancing the effectiveness of data intergration pipelines. 
Leveraging an SQL-like syntax for defining these transformations, the new 
feature ensures that users can quickly adapt to and utilize it.

Review Comment:
   ```suggestion
   Flink CDC 3.1.0 introduces the ability to make transformations in the CDC pipeline. By incorporating a `transform` section within the YAML pipeline definitions, users can now easily apply a variety of transformations to data change events from sources, including projections, calculations, and the addition of constant columns, enhancing the effectiveness of data integration pipelines. Leveraging an SQL-like syntax for defining these transformations, the new feature ensures that users can quickly adapt to and utilize it.
   ```
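For illustration, a `transform` section in a YAML pipeline definition might look like the following sketch. This is a hypothetical example: the table and column names are invented, and the key names (`source-table`, `projection`, `filter`) should be checked against the Flink CDC documentation for the release.

```yaml
transform:
  - source-table: app_db.orders                  # table the transform applies to (hypothetical)
    # projection with a calculation and a constant column, in SQL-like syntax
    projection: id, amount, amount * 0.1 AS tax, 'EUR' AS currency
    filter: amount > 0                           # keep only matching change events
```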



##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 
3.1.0! This is the first release after the community accepted the donation of 
Flink CDC as a sub-project of Apache Flink, with exciting new features such as 
transform and table merging. The eco-system of Flink CDC keeps expanding, 
including new Kafka and Paimon pipeline sinks and enhancement to existing 
connectors.
+
+We'd like to invite you to check out [Flink CDC 
documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and 
have a try on [the quickstart 
tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction)
 to explore the world of Flink CDC. Also we encourage you to [download the 
release](https://flink.apache.org/downloads.html) and share your feedback with 
the community through the Flink [mailing 
lists](https://flink.apache.org/community.html#mailing-lists) or 
[JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new 
release and we’d be eager to learn about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability of making transformations in the CDC 
pipeline. By incorporating a `transform` section within the YAML pipeline 
definitions, users can now easily apply a variety of transformations to data 
change event from source, including projections, calculations, and addition of 
constant columns, enhancing the effectiveness of data intergration pipelines. 
Leveraging an SQL-like syntax for defining these transformations, the new 
feature ensures that users can quickly adapt to and utilize it.
+
+### Table Merging Support
+
+Flink CDC 3.1.0 now suuports merging multiple tables into one by configuring 
`route` in the YAML pipeline definition.  It is a prevalent occurrence where 
business data is partitioned across tables even databases due to the 
substantial volume. By configuring `route`s that mapping multiple tables into 
one, data change events will be merged into the same destination table. 
Moreover, schema changes on source tables will also be applied to the 
destination.
+
+### Connectors
+
+#### Distributions of MySQL / Oracle / OceanBase / Db2 connectors
+
+Unfortunately due to the license incompatibility, we cannot ship JDBC drivers 
of the following connectors together with our binary release:
+
+- Db2
+- MySQL
+- Oracle
+- OceanBase
+
+Please manually download the corresponding JDBC driver into 
`$FLINK_CDC_HOME/lib` and `$FLINK_HOME/lib`, and specify their paths when 
submiting YAML pipelines with `--jar`, or make sure they are under the 
classpath if you are using Flink SQL.
+
+#### SinkFunction Support
+
+Although `SinkFunction` has been marked as deprecated in Flink, considering 
some connectors are still using the API, we also supports `SinkFunction` API 
for CDC pipeline sinks to help expanding the ecosystem of Flink CDC.
+
+#### New Pipeline Connectors
+
+Flink CDC 3.1.0 introduces 2 new pipeline connectors:
+
+- Apache Kafka sink
+- Apache Paimon sink
+
+#### MySQL
+
+In this release, MySQL pipeline source introduces a new option 
`tables.exclude` to exclude unnecessary tables from capturing with an easier 
expression. MySQL CDC sources is now shipped with a custom converter 
`MysqlDebeziumTimeConverter` for converting temporal type columns to a more 
human-readable and serialize-friendly string.
+
+#### OceanBase
+
+OceanBase CDC source now supports specifying the general 
`DebeziumDeserializationSchema` for reusing existing Debezium deserializers.
+
+#### Db2
+
+Db2 CDC source is now migrated to the unified incremental snapshot framework.
+
+### CLI
+
+Flink CDC pipeline submission CLI now supports recovering a pipeline exection 
from a specific savepoint file by using command line argument `--from-savepoint`

Review Comment:
   ```suggestion
   The Flink CDC pipeline submission CLI now supports recovering a pipeline execution from a specific savepoint file by using the command line argument `--from-savepoint`.
   ```
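An illustrative invocation might look as follows. The script name and paths are placeholders taken to match a typical Flink CDC distribution layout; only the `--from-savepoint` flag itself comes from the announcement.

```shell
# Resume a YAML pipeline from an existing savepoint (paths are hypothetical)
./bin/flink-cdc.sh my-pipeline.yaml --from-savepoint /tmp/savepoints/savepoint-1a2b3c
```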



##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 
3.1.0! This is the first release after the community accepted the donation of 
Flink CDC as a sub-project of Apache Flink, with exciting new features such as 
transform and table merging. The eco-system of Flink CDC keeps expanding, 
including new Kafka and Paimon pipeline sinks and enhancement to existing 
connectors.
+
+We'd like to invite you to check out [Flink CDC 
documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and 
have a try on [the quickstart 
tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction)
 to explore the world of Flink CDC. Also we encourage you to [download the 
release](https://flink.apache.org/downloads.html) and share your feedback with 
the community through the Flink [mailing 
lists](https://flink.apache.org/community.html#mailing-lists) or 
[JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new 
release and we’d be eager to learn about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability of making transformations in the CDC 
pipeline. By incorporating a `transform` section within the YAML pipeline 
definitions, users can now easily apply a variety of transformations to data 
change event from source, including projections, calculations, and addition of 
constant columns, enhancing the effectiveness of data intergration pipelines. 
Leveraging an SQL-like syntax for defining these transformations, the new 
feature ensures that users can quickly adapt to and utilize it.
+
+### Table Merging Support
+
+Flink CDC 3.1.0 now suuports merging multiple tables into one by configuring 
`route` in the YAML pipeline definition.  It is a prevalent occurrence where 
business data is partitioned across tables even databases due to the 
substantial volume. By configuring `route`s that mapping multiple tables into 
one, data change events will be merged into the same destination table. 
Moreover, schema changes on source tables will also be applied to the 
destination.
+
+### Connectors
+
+#### Distributions of MySQL / Oracle / OceanBase / Db2 connectors
+
+Unfortunately due to the license incompatibility, we cannot ship JDBC drivers 
of the following connectors together with our binary release:
+
+- Db2
+- MySQL
+- Oracle
+- OceanBase
+
+Please manually download the corresponding JDBC driver into 
`$FLINK_CDC_HOME/lib` and `$FLINK_HOME/lib`, and specify their paths when 
submiting YAML pipelines with `--jar`, or make sure they are under the 
classpath if you are using Flink SQL.
+
+#### SinkFunction Support
+
+Although `SinkFunction` has been marked as deprecated in Flink, considering 
some connectors are still using the API, we also supports `SinkFunction` API 
for CDC pipeline sinks to help expanding the ecosystem of Flink CDC.
+
+#### New Pipeline Connectors
+
+Flink CDC 3.1.0 introduces 2 new pipeline connectors:
+
+- Apache Kafka sink
+- Apache Paimon sink
+
+#### MySQL
+
+In this release, MySQL pipeline source introduces a new option 
`tables.exclude` to exclude unnecessary tables from capturing with an easier 
expression. MySQL CDC sources is now shipped with a custom converter 
`MysqlDebeziumTimeConverter` for converting temporal type columns to a more 
human-readable and serialize-friendly string.

Review Comment:
   ```suggestion
   In this release, the MySQL pipeline source introduces a new option `tables.exclude` to exclude unnecessary tables from capture using a simpler expression. The MySQL CDC source now ships with a custom converter `MysqlDebeziumTimeConverter` for converting temporal type columns to a more human-readable and serialization-friendly string.
   ```
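A hypothetical sketch of a MySQL pipeline source using the new option; connection details and table patterns are invented, and option names other than `tables.exclude` follow the usual MySQL pipeline source options as documented:

```yaml
source:
  type: mysql
  hostname: localhost            # connection details are placeholders
  port: 3306
  username: flinkuser
  password: "****"
  tables: app_db.\.*             # capture every table in app_db...
  tables.exclude: app_db.tmp_\.* # ...except tables matching this pattern
```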



##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 
3.1.0! This is the first release after the community accepted the donation of 
Flink CDC as a sub-project of Apache Flink, with exciting new features such as 
transform and table merging. The eco-system of Flink CDC keeps expanding, 
including new Kafka and Paimon pipeline sinks and enhancement to existing 
connectors.
+
+We'd like to invite you to check out [Flink CDC 
documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and 
have a try on [the quickstart 
tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction)
 to explore the world of Flink CDC. Also we encourage you to [download the 
release](https://flink.apache.org/downloads.html) and share your feedback with 
the community through the Flink [mailing 
lists](https://flink.apache.org/community.html#mailing-lists) or 
[JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new 
release and we’d be eager to learn about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability of making transformations in the CDC 
pipeline. By incorporating a `transform` section within the YAML pipeline 
definitions, users can now easily apply a variety of transformations to data 
change event from source, including projections, calculations, and addition of 
constant columns, enhancing the effectiveness of data intergration pipelines. 
Leveraging an SQL-like syntax for defining these transformations, the new 
feature ensures that users can quickly adapt to and utilize it.
+
+### Table Merging Support
+
+Flink CDC 3.1.0 now suuports merging multiple tables into one by configuring 
`route` in the YAML pipeline definition.  It is a prevalent occurrence where 
business data is partitioned across tables even databases due to the 
substantial volume. By configuring `route`s that mapping multiple tables into 
one, data change events will be merged into the same destination table. 
Moreover, schema changes on source tables will also be applied to the 
destination.
+
+### Connectors
+
+#### Distributions of MySQL / Oracle / OceanBase / Db2 connectors
+
+Unfortunately due to the license incompatibility, we cannot ship JDBC drivers 
of the following connectors together with our binary release:
+
+- Db2
+- MySQL
+- Oracle
+- OceanBase
+
+Please manually download the corresponding JDBC driver into 
`$FLINK_CDC_HOME/lib` and `$FLINK_HOME/lib`, and specify their paths when 
submiting YAML pipelines with `--jar`, or make sure they are under the 
classpath if you are using Flink SQL.

Review Comment:
   ```suggestion
   Please manually download the corresponding JDBC driver into 
`$FLINK_CDC_HOME/lib` and `$FLINK_HOME/lib`, and specify their paths when 
submitting YAML pipelines with `--jar`, or make sure they are under the 
classpath if you are using Flink SQL.
   ```
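For readers, the two options could be sketched as below. This is illustrative only: the driver jar name, version placeholder, and CLI script name are assumptions, and the driver must be obtained from its vendor or Maven Central under its own license.

```shell
# Option 1: place the driver on the classpath of both distributions
cp mysql-connector-j-<version>.jar "$FLINK_CDC_HOME/lib/"
cp mysql-connector-j-<version>.jar "$FLINK_HOME/lib/"

# Option 2: point the submission CLI at the jar explicitly (script name assumed)
./bin/flink-cdc.sh my-pipeline.yaml --jar /path/to/mysql-connector-j-<version>.jar
```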



##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 
3.1.0! This is the first release after the community accepted the donation of 
Flink CDC as a sub-project of Apache Flink, with exciting new features such as 
transform and table merging. The eco-system of Flink CDC keeps expanding, 
including new Kafka and Paimon pipeline sinks and enhancement to existing 
connectors.
+
+We'd like to invite you to check out [Flink CDC 
documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and 
have a try on [the quickstart 
tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction)
 to explore the world of Flink CDC. Also we encourage you to [download the 
release](https://flink.apache.org/downloads.html) and share your feedback with 
the community through the Flink [mailing 
lists](https://flink.apache.org/community.html#mailing-lists) or 
[JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new 
release and we’d be eager to learn about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability of making transformations in the CDC 
pipeline. By incorporating a `transform` section within the YAML pipeline 
definitions, users can now easily apply a variety of transformations to data 
change event from source, including projections, calculations, and addition of 
constant columns, enhancing the effectiveness of data intergration pipelines. 
Leveraging an SQL-like syntax for defining these transformations, the new 
feature ensures that users can quickly adapt to and utilize it.
+
+### Table Merging Support
+
+Flink CDC 3.1.0 now suuports merging multiple tables into one by configuring 
`route` in the YAML pipeline definition.  It is a prevalent occurrence where 
business data is partitioned across tables even databases due to the 
substantial volume. By configuring `route`s that mapping multiple tables into 
one, data change events will be merged into the same destination table. 
Moreover, schema changes on source tables will also be applied to the 
destination.
+
+### Connectors
+
+#### Distributions of MySQL / Oracle / OceanBase / Db2 connectors
+
+Unfortunately due to the license incompatibility, we cannot ship JDBC drivers 
of the following connectors together with our binary release:
+
+- Db2
+- MySQL
+- Oracle
+- OceanBase
+
+Please manually download the corresponding JDBC driver into 
`$FLINK_CDC_HOME/lib` and `$FLINK_HOME/lib`, and specify their paths when 
submiting YAML pipelines with `--jar`, or make sure they are under the 
classpath if you are using Flink SQL.
+
+#### SinkFunction Support
+
+Although `SinkFunction` has been marked as deprecated in Flink, considering 
some connectors are still using the API, we also supports `SinkFunction` API 
for CDC pipeline sinks to help expanding the ecosystem of Flink CDC.

Review Comment:
   ```suggestion
   Although `SinkFunction` has been marked as deprecated in Flink, considering that some connectors are still using the API, we also support the `SinkFunction` API for CDC pipeline sinks to help expand the ecosystem of Flink CDC.
   ```



##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 
3.1.0! This is the first release after the community accepted the donation of 
Flink CDC as a sub-project of Apache Flink, with exciting new features such as 
transform and table merging. The eco-system of Flink CDC keeps expanding, 
including new Kafka and Paimon pipeline sinks and enhancement to existing 
connectors.
+
+We'd like to invite you to check out [Flink CDC 
documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and 
have a try on [the quickstart 
tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction)
 to explore the world of Flink CDC. Also we encourage you to [download the 
release](https://flink.apache.org/downloads.html) and share your feedback with 
the community through the Flink [mailing 
lists](https://flink.apache.org/community.html#mailing-lists) or 
[JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new 
release and we’d be eager to learn about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability of making transformations in the CDC 
pipeline. By incorporating a `transform` section within the YAML pipeline 
definitions, users can now easily apply a variety of transformations to data 
change event from source, including projections, calculations, and addition of 
constant columns, enhancing the effectiveness of data intergration pipelines. 
Leveraging an SQL-like syntax for defining these transformations, the new 
feature ensures that users can quickly adapt to and utilize it.
+
+### Table Merging Support
+
+Flink CDC 3.1.0 now suuports merging multiple tables into one by configuring 
`route` in the YAML pipeline definition.  It is a prevalent occurrence where 
business data is partitioned across tables even databases due to the 
substantial volume. By configuring `route`s that mapping multiple tables into 
one, data change events will be merged into the same destination table. 
Moreover, schema changes on source tables will also be applied to the 
destination.

Review Comment:
   ```suggestion
   Flink CDC 3.1.0 now supports merging multiple tables into one by configuring `route` in the YAML pipeline definition. It is common for business data to be partitioned across tables or even databases due to its substantial volume. By configuring `route`s that map multiple tables into one, data change events will be merged into the same destination table. Moreover, schema changes on source tables will also be applied to the destination.
   ```
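A hypothetical `route` sketch for the table-merging case described above; the table names and pattern syntax are placeholders to be checked against the Flink CDC documentation:

```yaml
route:
  - source-table: app_db.order_\.*  # shard tables matched by a pattern (hypothetical names)
    sink-table: warehouse.orders    # change events from all matched tables merge here
```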



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

Reply via email to