This is an automated email from the ASF dual-hosted git repository.
xuba pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/amoro.git
The following commit(s) were added to refs/heads/master by this push:
new ab8a1963f [AMORO-4153]: Drop mixed Flink 1.17 support (#4154)
ab8a1963f is described below
commit ab8a1963f2276a73b515a86d0184dc37363876c3
Author: Jiwon Park <[email protected]>
AuthorDate: Tue Mar 31 20:31:14 2026 +0900
[AMORO-4153]: Drop mixed Flink 1.17 support (#4154)
* Drop mixed Flink 1.17 support
Flink 1.17 support is removed to enable upgrading Iceberg from 1.6.x
to 1.7.x, as Iceberg 1.7.0 dropped the iceberg-flink-1.17 artifact.
After this change, only Flink 1.18 and 1.19 are supported for the Mixed format.
- Remove v1.17 module directory (amoro-mixed-flink-1.17,
amoro-mixed-flink-runtime-1.17)
- Remove v1.17 module entries from parent pom.xml
- Update all documentation references from Flink 1.17 to 1.18
- Update Iceberg source reference in MixedCatalog.java
* Update AMORO_VERSION to 0.9.0-incubating in flink-get-started.md
Signed-off-by: Jiwon Park <[email protected]>
Co-authored-by: Xu Bai <[email protected]>
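For downstream users, the practical effect is that the Mixed-format Flink runtime jar must now be built for 1.18 or 1.19. A minimal sketch under the module layout shown in this patch; the `runtime_jar_dir` helper is hypothetical, added only to illustrate the path convention (the build command itself is the one documented in flink-get-started.md):

```shell
# Hypothetical helper mirroring the runtime module layout used in this repo:
#   amoro-format-mixed/amoro-mixed-flink/v<ver>/amoro-mixed-flink-runtime-<ver>/target
runtime_jar_dir() {
  local v="$1"   # Flink minor version, e.g. 1.18 or 1.19
  echo "amoro-format-mixed/amoro-mixed-flink/v${v}/amoro-mixed-flink-runtime-${v}/target"
}

# Build only the 1.18 runtime module and its dependencies, skipping tests
# (as documented in flink-get-started.md):
#   ./mvnw clean package -pl ':amoro-mixed-flink-runtime-1.18' -am -DskipTests

# Print where the shaded runtime jar lands after the build:
runtime_jar_dir 1.18
```

The same helper resolves the 1.19 path, since the v1.18 and v1.19 modules follow an identical directory convention.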
---
README.md | 2 +-
.../apache/amoro/flink/catalog/MixedCatalog.java | 2 +-
amoro-format-mixed/amoro-mixed-flink/pom.xml | 2 -
.../v1.17/amoro-mixed-flink-1.17/pom.xml | 99 ---------
.../v1.17/amoro-mixed-flink-runtime-1.17/pom.xml | 229 ---------------------
docs/_index.md | 2 +-
docs/admin-guides/deployment.md | 4 +-
docs/engines/flink/flink-cdc-ingestion.md | 2 +-
docs/engines/flink/flink-dml.md | 6 +-
docs/engines/flink/flink-ds.md | 2 +-
docs/engines/flink/flink-get-started.md | 16 +-
docs/engines/flink/using-logstore.md | 2 -
docs/user-guides/configurations.md | 2 +-
13 files changed, 19 insertions(+), 351 deletions(-)
diff --git a/README.md b/README.md
index 1d15247aa..e07afe7b0 100644
--- a/README.md
+++ b/README.md
@@ -82,7 +82,7 @@ Amoro support multiple processing engines for Mixed format as below:
 | Processing Engine | Version                | Batch Read | Batch Write | Batch Overwrite | Streaming Read | Streaming Write | Create Table | Alter Table |
 |-------------------|------------------------|-------------|-------------|-----------------|----------------|-----------------|--------------|-------------|
-| Flink             | 1.17.x, 1.18.x, 1.19.x | ✔          | ✔          | ✖               | ✔              | ✔               | ✔            | ✖           |
+| Flink             | 1.18.x, 1.19.x         | ✔          | ✔          | ✖               | ✔              | ✔               | ✔            | ✖           |
 | Spark             | 3.3, 3.4, 3.5          | ✔          | ✔          | ✔               | ✖              | ✖               | ✔            | ✔           |
 | Hive              | 2.x, 3.x               | ✔          | ✖          | ✔               | ✖              | ✖               | ✖            | ✔           |
 | Trino             | 406                    | ✔          | ✖          | ✔               | ✖              | ✖               | ✖            | ✔           |
diff --git a/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java b/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java
index f4f4bb697..3633d8ed4 100644
--- a/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java
+++ b/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java
@@ -665,7 +665,7 @@ public class MixedCatalog extends AbstractCatalog {
/**
* copy from
-   * https://github.com/apache/iceberg/blob/1.6.x/flink/v1.17/flink/src/main/java/org/apache/iceberg/flink/FlinkCatalog.java#L425-L448
+   * https://github.com/apache/iceberg/blob/1.6.x/flink/v1.18/flink/src/main/java/org/apache/iceberg/flink/FlinkCatalog.java#L425-L448
*
* @param ct1 CatalogTable before
* @param ct2 CatalogTable after
diff --git a/amoro-format-mixed/amoro-mixed-flink/pom.xml b/amoro-format-mixed/amoro-mixed-flink/pom.xml
index 399691563..87ace106d 100644
--- a/amoro-format-mixed/amoro-mixed-flink/pom.xml
+++ b/amoro-format-mixed/amoro-mixed-flink/pom.xml
@@ -36,8 +36,6 @@
<module>amoro-mixed-flink-common</module>
<module>amoro-mixed-flink-common-format</module>
<module>amoro-mixed-flink-common-iceberg-bridge</module>
- <module>v1.17/amoro-mixed-flink-1.17</module>
- <module>v1.17/amoro-mixed-flink-runtime-1.17</module>
<module>v1.18/amoro-mixed-flink-1.18</module>
<module>v1.18/amoro-mixed-flink-runtime-1.18</module>
<module>v1.19/amoro-mixed-flink-1.19</module>
diff --git a/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-1.17/pom.xml b/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-1.17/pom.xml
deleted file mode 100644
index 2e349a5a5..000000000
--- a/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-1.17/pom.xml
+++ /dev/null
@@ -1,99 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- ~ Licensed to the Apache Software Foundation (ASF) under one
- ~ or more contributor license agreements. See the NOTICE file
- ~ distributed with this work for additional information
- ~ regarding copyright ownership. The ASF licenses this file
- ~ to you under the Apache License, Version 2.0 (the
- ~ "License"); you may not use this file except in compliance
- ~ with the License. You may obtain a copy of the License at
- ~
- ~ http://www.apache.org/licenses/LICENSE-2.0
- ~
- ~ Unless required by applicable law or agreed to in writing, software
- ~ distributed under the License is distributed on an "AS IS" BASIS,
- ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- ~ See the License for the specific language governing permissions and
- ~ limitations under the License.
- -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <modelVersion>4.0.0</modelVersion>
- <parent>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-mixed-flink</artifactId>
- <version>0.9-SNAPSHOT</version>
- <relativePath>../../pom.xml</relativePath>
- </parent>
-
- <artifactId>amoro-format-mixed-flink-1.17</artifactId>
-
- <packaging>jar</packaging>
- <name>Amoro Project Mixed Format Flink 1.17</name>
- <url>https://amoro.apache.org</url>
-
- <properties>
- <kafka.version>3.2.3</kafka.version>
- <assertj.version>3.21.0</assertj.version>
- <flink.version>1.17.2</flink.version>
- </properties>
-
- <dependencies>
- <dependency>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-mixed-flink-common</artifactId>
- <version>${project.parent.version}</version>
- </dependency>
-
- <dependency>
- <groupId>org.apache.iceberg</groupId>
- <artifactId>iceberg-flink-1.17</artifactId>
- <version>${iceberg.version}</version>
- <exclusions>
- <exclusion>
- <groupId>org.slf4j</groupId>
- <artifactId>slf4j-api</artifactId>
- </exclusion>
- <exclusion>
- <groupId>org.apache.parquet</groupId>
- <artifactId>parquet-column</artifactId>
- </exclusion>
- <exclusion>
- <groupId>org.apache.parquet</groupId>
- <artifactId>parquet-avro</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
-
- <dependency>
- <groupId>org.apache.paimon</groupId>
- <artifactId>paimon-flink-1.17</artifactId>
- <version>${paimon.version}</version>
- </dependency>
- </dependencies>
-
- <build>
- <plugins>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-shade-plugin</artifactId>
- <executions>
- <execution>
- <id>shade-amoro</id>
- <goals>
- <goal>shade</goal>
- </goals>
- <phase>package</phase>
- <configuration>
- <artifactSet>
- <includes combine.children="append">
-                                    <include>org.apache.amoro:amoro-format-mixed-flink-common</include>
- </includes>
- </artifactSet>
- </configuration>
- </execution>
- </executions>
- </plugin>
- </plugins>
- </build>
-</project>
diff --git a/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/pom.xml b/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/pom.xml
deleted file mode 100644
index 4d5186dc3..000000000
--- a/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/pom.xml
+++ /dev/null
@@ -1,229 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- ~ Licensed to the Apache Software Foundation (ASF) under one
- ~ or more contributor license agreements. See the NOTICE file
- ~ distributed with this work for additional information
- ~ regarding copyright ownership. The ASF licenses this file
- ~ to you under the Apache License, Version 2.0 (the
- ~ "License"); you may not use this file except in compliance
- ~ with the License. You may obtain a copy of the License at
- ~
- ~ http://www.apache.org/licenses/LICENSE-2.0
- ~
- ~ Unless required by applicable law or agreed to in writing, software
- ~ distributed under the License is distributed on an "AS IS" BASIS,
- ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- ~ See the License for the specific language governing permissions and
- ~ limitations under the License.
- -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <modelVersion>4.0.0</modelVersion>
- <parent>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-mixed-flink</artifactId>
- <version>0.9-SNAPSHOT</version>
- <relativePath>../../pom.xml</relativePath>
- </parent>
-
- <artifactId>amoro-format-mixed-flink-runtime-1.17</artifactId>
- <name>Amoro Project Mixed Format Flink 1.17 Runtime</name>
- <url>https://amoro.apache.org</url>
-
- <properties>
- <flink.version>1.17.2</flink.version>
- </properties>
-
- <dependencies>
- <dependency>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-format-mixed-flink-1.17</artifactId>
- <version>${project.parent.version}</version>
- </dependency>
-
- <dependency>
- <groupId>org.apache.flink</groupId>
- <artifactId>flink-connector-kafka</artifactId>
- <version>${flink.version}</version>
- <exclusions>
- <exclusion>
- <groupId>com.github.luben</groupId>
- <artifactId>zstd-jni</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
-
- <dependency>
- <groupId>org.apache.kafka</groupId>
- <artifactId>kafka-clients</artifactId>
- <version>${kafka.version}</version>
- <exclusions>
- <exclusion>
- <groupId>com.github.luben</groupId>
- <artifactId>zstd-jni</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
- </dependencies>
-
- <build>
- <plugins>
- <plugin>
- <groupId>org.apache.maven.plugins</groupId>
- <artifactId>maven-shade-plugin</artifactId>
- <executions>
- <execution>
- <id>shade-amoro</id>
- <goals>
- <goal>shade</goal>
- </goals>
- <phase>package</phase>
- <configuration>
-                            <createDependencyReducedPom>false</createDependencyReducedPom>
-                            <artifactSet>
-                                <includes combine.children="append">
-                                    <include>org.apache.amoro:*</include>
-                                    <include>org.apache.iceberg:*</include>
-                                    <include>com.fasterxml.jackson.core:*</include>
-                                    <include>org.apache.parquet:*</include>
-                                    <include>org.apache.commons:*</include>
-                                    <include>commons-lang:*</include>
-                                    <include>com.github.ben-manes.caffeine:*</include>
-                                    <include>org.apache.avro:*</include>
-                                    <include>org.apache.orc:*</include>
-                                    <include>io.airlift:*</include>
-                                    <include>commons-collections:*</include>
-                                    <include>cglib:*</include>
-                                    <include>com.google.guava:*</include>
-                                    <include>asm:*</include>
-                                    <include>org.apache.httpcomponents.client5:*</include>
-                                    <include>org.apache.httpcomponents.core5:*</include>
-                                    <include>org.apache.flink:flink-connector-kafka</include>
-                                    <include>org.apache.kafka:*</include>
-                                    <include>com.github.luben:*</include>
-                                    <include>com.github.luben:*</include>
-                                </includes>
-                            </artifactSet>
-
-                            <relocations>
-                                <relocation>
-                                    <pattern>org.apache.iceberg</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.iceberg</shadedPattern>
-                                    <excludes>
-                                        <exclude>org.apache.iceberg.mr.hive.*</exclude>
-                                    </excludes>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.parquet</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.parquet</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.commons</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.commons</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.avro</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.avro</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.orc</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.orc</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.hc</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.hc</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.jute</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.jute</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.kafka</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.kafka</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>shaded.parquet</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.shaded.parquet</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.fasterxml</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.fasterxml</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.github.benmanes.caffeine</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.github.benmanes.caffeine</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.threeten.extra</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.threeten.extra</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>net.sf.cglib</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.net.sf.cglib</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.google</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.google</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.objectweb.asm</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.objectweb.asm</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.facebook.fb303</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.facebook.fb303</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>io.airlift</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.io.airlift</shadedPattern>
-                                </relocation>
-
-                                <!-- flink-sql-connector-kafka -->
-                                <relocation>
-                                    <pattern>org.apache.flink.connector.kafka</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.connector.kafka</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.connectors.kafka</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.connectors.kafka</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.KeyedSerializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.KeyedSerializationSchema</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.TypeInformationKeyValueSerializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.TypeInformationKeyValueSerializationSchema</shadedPattern>
-                                </relocation>
-                            </relocations>
- </configuration>
- </execution>
- </executions>
- </plugin>
- </plugins>
- </build>
-</project>
diff --git a/docs/_index.md b/docs/_index.md
index 7c3a7a4b3..4d373889d 100644
--- a/docs/_index.md
+++ b/docs/_index.md
@@ -69,7 +69,7 @@ Amoro support multiple processing engines for Mixed format as below:
 | Processing Engine | Version                | Batch Read | Batch Write | Batch Overwrite | Streaming Read | Streaming Write | Create Table | Alter Table |
 |-------------------|------------------------|-------------|-------------|-----------------|----------------|-----------------|--------------|-------------|
-| Flink             | 1.17.x, 1.18.x, 1.19.x | ✔          | ✔          | ✖               | ✔              | ✔               | ✔            | ✖           |
+| Flink             | 1.18.x, 1.19.x         | ✔          | ✔          | ✖               | ✔              | ✔               | ✔            | ✖           |
 | Spark             | 3.3, 3.4, 3.5          | ✔          | ✔          | ✔               | ✖              | ✖               | ✔            | ✔           |
 | Hive              | 2.x, 3.x               | ✔          | ✖          | ✔               | ✖              | ✖               | ✖            | ✔           |
 | Trino             | 406                    | ✔          | ✖          | ✔               | ✖              | ✖               | ✖            | ✔           |
diff --git a/docs/admin-guides/deployment.md b/docs/admin-guides/deployment.md
index 5a185c01d..57f131533 100644
--- a/docs/admin-guides/deployment.md
+++ b/docs/admin-guides/deployment.md
@@ -53,9 +53,9 @@ $ cd dist/target/
$ ls
amoro-x.y.z-bin.zip # AMS release package
-$ cd ${base_dir}/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/target
+$ cd ${base_dir}/amoro-format-mixed/amoro-mixed-flink/v1.18/amoro-mixed-flink-runtime-1.18/target
$ ls
-amoro-format-mixed-flink-runtime-1.17-x.y.z.jar # Flink 1.17 runtime package
+amoro-format-mixed-flink-runtime-1.18-x.y.z.jar # Flink 1.18 runtime package
$ cd
${base_dir}/amoro-format-mixed/amoro-mixed-spark/v3.3/amoro-mixed-spark-runtime-3.3/target
$ ls
diff --git a/docs/engines/flink/flink-cdc-ingestion.md b/docs/engines/flink/flink-cdc-ingestion.md
index bf1df9b11..09a67d2c1 100644
--- a/docs/engines/flink/flink-cdc-ingestion.md
+++ b/docs/engines/flink/flink-cdc-ingestion.md
@@ -341,7 +341,7 @@ The following example will show how to write CDC data from multiple MySQL tables
 **Requirements**
-Please add [Flink Connector MySQL CDC](https://mvnrepository.com/artifact/org.apache.flink/flink-connector-mysql-cdc/3.1.1) and [Amoro](https://mvnrepository.com/artifact/org.apache.amoro/amoro-format-mixed-flink-1.17/0.7.0-incubating) dependencies to your Maven project's pom.xml file.
+Please add [Flink Connector MySQL CDC](https://mvnrepository.com/artifact/org.apache.flink/flink-connector-mysql-cdc/3.1.1) and [Amoro](https://mvnrepository.com/artifact/org.apache.amoro/amoro-format-mixed-flink-1.18/0.9.0-incubating) dependencies to your Maven project's pom.xml file.
```java
import org.apache.amoro.flink.InternalCatalogBuilder;
diff --git a/docs/engines/flink/flink-dml.md b/docs/engines/flink/flink-dml.md
index c226704a3..2c5896d7f 100644
--- a/docs/engines/flink/flink-dml.md
+++ b/docs/engines/flink/flink-dml.md
@@ -108,13 +108,13 @@ The following Hint Options are supported:
 | scan.startup.specific-offsets | (none) | String | No | specify offsets for each partition in case of 'specific-offsets' startup mode, e.g. 'partition:0,offset:42;partition:1,offset:300'. [...]
 | properties.group.id | (none) | String | If the LogStore for an Amoro table is Kafka, it is mandatory to provide its details while querying the table. Otherwise, it can be left empty. | The group id used to read the Kafka Topic [...]
 | properties.pulsar.admin.adminUrl | (none) | String | Required if LogStore is pulsar, otherwise not required | Pulsar admin's HTTP URL, e.g. http://my-broker.example.com:8080 [...]
-| properties.* | (none) | String | No | Parameters for Logstore: <br>For Logstore with Kafka ('log-store.type'='kafka' default value), all other parameters supported by the Kafka Consumer can be set by prefixing properties. to the parameter name, for example, 'properties.batch.size'='16384'. The complete parameter inform [...]
+| properties.* | (none) | String | No | Parameters for Logstore: <br>For Logstore with Kafka ('log-store.type'='kafka' default value), all other parameters supported by the Kafka Consumer can be set by prefixing properties. to the parameter name, for example, 'properties.batch.size'='16384'. The complete parameter inform [...]
 | log.consumer.changelog.modes | all-kinds | String | No | The type of RowKind that will be generated when reading log data, supports: all-kinds, append-only.<br>all-kinds: will read cdc data, including +I/-D/-U/+U;<br>append-only: will only generate Insert data, recommended to use this configuration when reading without primary key. [...]
 > **Notes**
 >
 > - When log-store.type = pulsar, the parallelism of the Flink task cannot be less than the number of partitions in the Pulsar topic, otherwise some partition data cannot be read.
-> - When the number of topic partitions in log-store is less than the parallelism of the Flink task, some Flink subtasks will be idle. At this time, if the task has a watermark, the parameter table.exec.source.idle-timeout must be configured, otherwise the watermark will not advance. See [official documentation](https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/dev/table/config/#table-exec-source-idle-timeout) for details.
+> - When the number of topic partitions in log-store is less than the parallelism of the Flink task, some Flink subtasks will be idle. At this time, if the task has a watermark, the parameter table.exec.source.idle-timeout must be configured, otherwise the watermark will not advance. See [official documentation](https://nightlies.apache.org/flink/flink-docs-release-1.18/docs/dev/table/config/#table-exec-source-idle-timeout) for details.
### Streaming mode (FileStore non-primary key table)
@@ -204,7 +204,7 @@ Hint Options
 | write.distribution-mode | hash | String | No | The distribution modes for writing to the Amoro table include: none and hash. [...]
 | write.distribution.hash-mode | auto | String | No | The hash strategy for writing to an Amoro table only takes effect when write.distribution-mode=hash. The available options are: primary-key, partition-key, primary-partition-key, and auto. primary-key: Shuffle by primary key partition-key: Shuffle by partition key primary-partitio [...]
 | properties.pulsar.admin.adminUrl | (none) | String | If the LogStore is Pulsar and it is required for querying, it must be filled in, otherwise it can be left empty.<img width=100/> | The HTTP URL for Pulsar Admin is in the format: http://my-broker.example.com:8080. [...]
-| properties.* | (none) | String | No | Parameters for Logstore: For Logstore with Kafka ('log-store.type'='kafka' default value), all other parameters supported by the Kafka Consumer can be set by prefixing properties. to the parameter name, for example, 'properties.batch.size'='16384'. The complete parameter informati [...]
+| properties.* | (none) | String | No | Parameters for Logstore: For Logstore with Kafka ('log-store.type'='kafka' default value), all other parameters supported by the Kafka Consumer can be set by prefixing properties. to the parameter name, for example, 'properties.batch.size'='16384'. The complete parameter informati [...]
 | other table parameters | (none) | String | No | All parameters of an Amoro table can be dynamically modified through SQL Hints, but they only take effect for this specific task. For the specific parameter list, please refer to the [Table Configuration](../configurations/). For permissions-related configurations on the catalog, [...]
## Lookup join with SQL
diff --git a/docs/engines/flink/flink-ds.md b/docs/engines/flink/flink-ds.md
index 56e0c1dbd..9ddc5d7e9 100644
--- a/docs/engines/flink/flink-ds.md
+++ b/docs/engines/flink/flink-ds.md
@@ -33,7 +33,7 @@ To add a dependency on Mixed-format flink connector in Maven, add the following
...
<dependency>
<groupId>org.apache.amoro</groupId>
- <!-- For example: amoro-format-mixed-flink-runtime-1.17 -->
+ <!-- For example: amoro-format-mixed-flink-runtime-1.18 -->
<artifactId>amoro-format-mixed-flink-runtime-${flink.minor-version}</artifactId>
<!-- For example: 0.7.0-incubating -->
<version>${amoro-format-mixed-flink.version}</version>
diff --git a/docs/engines/flink/flink-get-started.md b/docs/engines/flink/flink-get-started.md
index a29511830..86588caa5 100644
--- a/docs/engines/flink/flink-get-started.md
+++ b/docs/engines/flink/flink-get-started.md
@@ -54,17 +54,17 @@ Flink Connector includes:
The Amoro project can be self-compiled to obtain the runtime jar.
-`./mvnw clean package -pl ':amoro-mixed-flink-runtime-1.17' -am -DskipTests`
+`./mvnw clean package -pl ':amoro-mixed-flink-runtime-1.18' -am -DskipTests`
-The Flink Runtime Jar is located in the `amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/target` directory.
+The Flink Runtime Jar is located in the `amoro-format-mixed/amoro-mixed-flink/v1.18/amoro-mixed-flink-runtime-1.18/target` directory.
 ## Environment preparation
-Download Flink and related dependencies, and download Flink 1.17/1.18/1.19 as needed. Taking Flink 1.17 as an example:
+Download Flink and related dependencies, and download Flink 1.18/1.19 as needed. Taking Flink 1.18 as an example:
```shell
# Replace version value with the latest Amoro version if needed
-AMORO_VERSION=0.8.0-incubating
-FLINK_VERSION=1.17.2
-FLINK_MAJOR_VERSION=1.17
+AMORO_VERSION=0.9.0-incubating
+FLINK_VERSION=1.18.1
+FLINK_MAJOR_VERSION=1.18
FLINK_HADOOP_SHADE_VERSION=2.7.5
APACHE_FLINK_URL=archive.apache.org/dist/flink
MAVEN_URL=https://repo1.maven.org/maven2
@@ -90,7 +90,7 @@ mv amoro-mixed-format-flink-runtime-${FLINK_MAJOR_VERSION}-${AMORO_VERSION}.jar
Modify Flink related configuration files:
```shell
-cd flink-1.17.2
+cd flink-1.18.1
vim conf/flink-conf.yaml
```
Modify the following settings:
@@ -128,6 +128,6 @@ You need to enable Flink checkpoint and modify the [Flink checkpoint configurati
 The query results obtained through Flink SQL-Client cannot provide MOR semantics based on primary keys. If you need to obtain merged results through Flink engine queries, you can write the content of Amoro tables to a MySQL table through JDBC connector for viewing.
-**3. When writing to Amoro tables with write.upsert feature enabled through SQL-Client under Flink 1.17, there are still duplicate primary key data**
+**3. When writing to Amoro tables with write.upsert feature enabled through SQL-Client under Flink 1.18, there are still duplicate primary key data**
 You need to execute the command `set table.exec.sink.upsert-materialize = none` in SQL-Client to turn off the upsert materialize operator generated upsert view. This operator will affect the AmoroWriter's generation of delete data when the write.upsert feature is enabled, causing duplicate primary key data to not be merged.
diff --git a/docs/engines/flink/using-logstore.md b/docs/engines/flink/using-logstore.md
index 0ecd50095..32280653b 100644
--- a/docs/engines/flink/using-logstore.md
+++ b/docs/engines/flink/using-logstore.md
@@ -42,7 +42,6 @@ Users can enable LogStore by configuring the following parameters when creating
| Flink | Kafka |
|------------|----------|
-| Flink 1.17 | ✔ |
| Flink 1.18 | ✔ |
| Flink 1.19 | ✔ |
@@ -50,7 +49,6 @@ Kafka as LogStore Version Description:
| Flink Version | Kafka Versions |
|---------------| ----------------- |
-| 1.17.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
| 1.18.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
| 1.19.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
diff --git a/docs/user-guides/configurations.md b/docs/user-guides/configurations.md
index 72af2605a..843823f2f 100644
--- a/docs/user-guides/configurations.md
+++ b/docs/user-guides/configurations.md
@@ -138,7 +138,7 @@ If using Iceberg Format,please refer to [Iceberg configurations](https://icebe
 | log-store.address | NULL | Address of LogStore, required when LogStore enabled. For Kafka, this is the Kafka bootstrap servers. For Pulsar, this is the Pulsar Service URL, such as 'pulsar://localhost:6650' [...]
 | log-store.topic | NULL | Topic of LogStore, required when LogStore enabled [...]
 | properties.pulsar.admin.adminUrl | NULL | HTTP URL of Pulsar admin, such as 'http://my-broker.example.com:8080'. Only required when log-store.type=pulsar [...]
-| properties.XXX | NULL | Other configurations of LogStore. <br><br>For Kafka, all the configurations supported by Kafka Consumer/Producer can be set by prefixing them with `properties.`,<br>such as `'properties.batch.size'='16384'`,<br>refer to [Kafka Consumer Configurations](https://kafka.apache.org/documentation/#consumerconfigs), [Kafka Producer Configurations](https://kafka.apache.org/documentation/#producerconfigs) for more details.<br><br> For Pulsar,al [...]
+| properties.XXX | NULL | Other configurations of LogStore. <br><br>For Kafka, all the configurations supported by Kafka Consumer/Producer can be set by prefixing them with `properties.`,<br>such as `'properties.batch.size'='16384'`,<br>refer to [Kafka Consumer Configurations](https://kafka.apache.org/documentation/#consumerconfigs), [Kafka Producer Configurations](https://kafka.apache.org/documentation/#producerconfigs) for more details.<br><br> For Pulsar,al [...]
### Watermark configurations