This is an automated email from the ASF dual-hosted git repository.
czy006 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/amoro.git
The following commit(s) were added to refs/heads/master by this push:
new 60fd89300 [AMORO-4128] Drop mixed Flink 1.16 support (#4132)
60fd89300 is described below
commit 60fd89300a90ebd8fa7ccf09780a2faf47821537
Author: Jiwon Park <[email protected]>
AuthorDate: Thu Mar 19 14:22:23 2026 +0900
[AMORO-4128] Drop mixed Flink 1.16 support (#4132)
* Drop mixed Flink 1.16 support (#4128)
Remove mixed Flink 1.16 modules and update documentation to reflect
the supported Flink versions (1.17, 1.18). This unblocks future
Iceberg upgrades that no longer support Flink 1.16.
- Remove v1.16/amoro-mixed-flink-1.16 and
v1.16/amoro-mixed-flink-runtime-1.16 modules
- Remove Flink 1.16 entries from parent pom.xml module list
- Update version matrices in README and docs
- Update Flink documentation links from 1.16 to 1.17
- Clean up stale Flink 1.15 references in docs
Signed-off-by: Jiwon Park <[email protected]>
* Address review comment: sync MixedCatalog method with Iceberg 1.6.x source
Signed-off-by: Jiwon Park <[email protected]>
* Fix stale Spark 3.2 path reference in deployment docs
Update the example path from the non-existent
amoro-format-mixed-spark/v3.2 to amoro-mixed-spark/v3.3 to match
the current project structure. Also remove a trailing parenthesis
typo in the jar filename comment.
Signed-off-by: Jiwon Park <[email protected]>
---------
Signed-off-by: Jiwon Park <[email protected]>
---
README.md | 2 +-
.../apache/amoro/flink/catalog/MixedCatalog.java | 10 +-
amoro-format-mixed/amoro-mixed-flink/pom.xml | 2 -
.../v1.16/amoro-mixed-flink-1.16/pom.xml | 100 ---------
.../v1.16/amoro-mixed-flink-runtime-1.16/pom.xml | 229 ---------------------
docs/_index.md | 2 +-
docs/admin-guides/deployment.md | 10 +-
docs/engines/flink/flink-dml.md | 6 +-
docs/engines/flink/flink-ds.md | 2 +-
docs/engines/flink/flink-get-started.md | 16 +-
docs/engines/flink/using-logstore.md | 8 +-
docs/user-guides/configurations.md | 2 +-
12 files changed, 31 insertions(+), 358 deletions(-)
diff --git a/README.md b/README.md
index d4bdf344f..d3c3e8783 100644
--- a/README.md
+++ b/README.md
@@ -82,7 +82,7 @@ Amoro support multiple processing engines for Mixed format as below:
| Processing Engine | Version | Batch Read | Batch Write | Batch Overwrite | Streaming Read | Streaming Write | Create Table | Alter Table |
|-------------------|------------------------|-------------|-------------|-----------------|----------------|-----------------|--------------|-------------|
-| Flink | 1.16.x, 1.17.x, 1.18.x | ✔ | ✔ | ✖ | ✔ | ✔ | ✔ | ✖ |
+| Flink | 1.17.x, 1.18.x | ✔ | ✔ | ✖ | ✔ | ✔ | ✔ | ✖ |
| Spark | 3.3, 3.4, 3.5 | ✔ | ✔ | ✔ | ✖ | ✖ | ✔ | ✔ |
| Hive | 2.x, 3.x | ✔ | ✖ | ✔ | ✖ | ✖ | ✖ | ✔ |
| Trino | 406 | ✔ | ✖ | ✔ | ✖ | ✖ | ✖ | ✔ |
diff --git a/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java b/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java
index 3026f608b..f4f4bb697 100644
--- a/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java
+++ b/amoro-format-mixed/amoro-mixed-flink/amoro-mixed-flink-common/src/main/java/org/apache/amoro/flink/catalog/MixedCatalog.java
@@ -665,7 +665,7 @@ public class MixedCatalog extends AbstractCatalog {
  /**
   * copy from
-  * https://github.com/apache/iceberg/blob/main/flink/v1.16/flink/src/main/java/org/apache/iceberg/flink/FlinkCatalog.java#L425C23-L425C54
+  * https://github.com/apache/iceberg/blob/1.6.x/flink/v1.17/flink/src/main/java/org/apache/iceberg/flink/FlinkCatalog.java#L425-L448
   *
   * @param ct1 CatalogTable before
   * @param ct2 CatalogTable after
@@ -687,9 +687,15 @@ public class MixedCatalog extends AbstractCatalog {
    if (!(Objects.equal(ts1.getTableColumns(), ts2.getTableColumns())
        && Objects.equal(ts1.getWatermarkSpecs(), ts2.getWatermarkSpecs())
        && equalsPrimary)) {
-      throw new UnsupportedOperationException("Altering schema is not supported yet.");
+      throw new UnsupportedOperationException(
+          "Altering schema is not supported in the old alterTable API. "
+              + "To alter schema, use the other alterTable API and provide a list of TableChange's.");
    }
+    validateTablePartition(ct1, ct2);
+  }
+
+  private static void validateTablePartition(CatalogTable ct1, CatalogTable ct2) {
    if (!ct1.getPartitionKeys().equals(ct2.getPartitionKeys())) {
      throw new UnsupportedOperationException("Altering partition keys is not supported yet.");
    }
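The hunk above splits the partition-key check out of the schema check so the old alterTable API rejects schema changes with a pointer to the TableChange-based API. A minimal, self-contained sketch of that control flow, using a hypothetical `TableMeta` stand-in instead of Flink's `CatalogTable` (not the actual Amoro class):

```java
import java.util.Arrays;
import java.util.List;

public class AlterTableValidationSketch {

    // Hypothetical stand-in for CatalogTable: only the fields the checks need.
    static class TableMeta {
        final List<String> columns;
        final List<String> partitionKeys;

        TableMeta(List<String> columns, List<String> partitionKeys) {
            this.columns = columns;
            this.partitionKeys = partitionKeys;
        }
    }

    // Mirrors the patched flow: reject schema changes first, then delegate
    // the partition-key comparison to its own helper.
    static void validateSchemaAndPartition(TableMeta before, TableMeta after) {
        if (!before.columns.equals(after.columns)) {
            throw new UnsupportedOperationException(
                "Altering schema is not supported in the old alterTable API. "
                    + "To alter schema, use the other alterTable API and provide a list of TableChange's.");
        }
        validateTablePartition(before, after);
    }

    static void validateTablePartition(TableMeta before, TableMeta after) {
        if (!before.partitionKeys.equals(after.partitionKeys)) {
            throw new UnsupportedOperationException("Altering partition keys is not supported yet.");
        }
    }

    public static void main(String[] args) {
        TableMeta before = new TableMeta(Arrays.asList("id", "name"), Arrays.asList("id"));
        TableMeta after = new TableMeta(Arrays.asList("id", "name"), Arrays.asList("name"));
        try {
            // Same columns but changed partition keys: the partition check fires.
            validateSchemaAndPartition(before, after);
        } catch (UnsupportedOperationException e) {
            System.out.println(e.getMessage());
        }
    }
}
```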
diff --git a/amoro-format-mixed/amoro-mixed-flink/pom.xml b/amoro-format-mixed/amoro-mixed-flink/pom.xml
index 0c0654b8c..28773c0b1 100644
--- a/amoro-format-mixed/amoro-mixed-flink/pom.xml
+++ b/amoro-format-mixed/amoro-mixed-flink/pom.xml
@@ -36,8 +36,6 @@
<module>amoro-mixed-flink-common</module>
<module>amoro-mixed-flink-common-format</module>
<module>amoro-mixed-flink-common-iceberg-bridge</module>
- <module>v1.16/amoro-mixed-flink-1.16</module>
- <module>v1.16/amoro-mixed-flink-runtime-1.16</module>
<module>v1.17/amoro-mixed-flink-1.17</module>
<module>v1.17/amoro-mixed-flink-runtime-1.17</module>
<module>v1.18/amoro-mixed-flink-1.18</module>
diff --git a/amoro-format-mixed/amoro-mixed-flink/v1.16/amoro-mixed-flink-1.16/pom.xml b/amoro-format-mixed/amoro-mixed-flink/v1.16/amoro-mixed-flink-1.16/pom.xml
deleted file mode 100644
index f8235e5a6..000000000
--- a/amoro-format-mixed/amoro-mixed-flink/v1.16/amoro-mixed-flink-1.16/pom.xml
+++ /dev/null
@@ -1,100 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- ~ Licensed to the Apache Software Foundation (ASF) under one
- ~ or more contributor license agreements. See the NOTICE file
- ~ distributed with this work for additional information
- ~ regarding copyright ownership. The ASF licenses this file
- ~ to you under the Apache License, Version 2.0 (the
- ~ "License"); you may not use this file except in compliance
- ~ with the License. You may obtain a copy of the License at
- ~
- ~ http://www.apache.org/licenses/LICENSE-2.0
- ~
- ~ Unless required by applicable law or agreed to in writing, software
- ~ distributed under the License is distributed on an "AS IS" BASIS,
- ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- ~ See the License for the specific language governing permissions and
- ~ limitations under the License.
- -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <modelVersion>4.0.0</modelVersion>
- <parent>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-mixed-flink</artifactId>
- <version>0.9-SNAPSHOT</version>
- <relativePath>../../pom.xml</relativePath>
- </parent>
-
- <artifactId>amoro-format-mixed-flink-1.16</artifactId>
-
- <packaging>jar</packaging>
- <name>Amoro Project Mixed Format Flink 1.16</name>
- <url>https://amoro.apache.org</url>
-
- <properties>
- <iceberg.version>1.4.3</iceberg.version>
- <kafka.version>3.2.3</kafka.version>
- <assertj.version>3.21.0</assertj.version>
- <flink.version>1.16.3</flink.version>
- </properties>
-
- <dependencies>
- <dependency>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-mixed-flink-common</artifactId>
- <version>${project.parent.version}</version>
- </dependency>
-
- <dependency>
- <groupId>org.apache.iceberg</groupId>
- <artifactId>iceberg-flink-1.16</artifactId>
- <version>${iceberg.version}</version>
- <exclusions>
- <exclusion>
- <groupId>org.slf4j</groupId>
- <artifactId>slf4j-api</artifactId>
- </exclusion>
- <exclusion>
- <groupId>org.apache.parquet</groupId>
- <artifactId>parquet-column</artifactId>
- </exclusion>
- <exclusion>
- <groupId>org.apache.parquet</groupId>
- <artifactId>parquet-avro</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
-
- <dependency>
- <groupId>org.apache.paimon</groupId>
- <artifactId>paimon-flink-1.16</artifactId>
- <version>${paimon.version}</version>
- </dependency>
- </dependencies>
-
-    <build>
-        <plugins>
-            <plugin>
-                <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-shade-plugin</artifactId>
-                <executions>
-                    <execution>
-                        <id>shade-amoro</id>
-                        <goals>
-                            <goal>shade</goal>
-                        </goals>
-                        <phase>package</phase>
-                        <configuration>
-                            <artifactSet>
-                                <includes combine.children="append">
-                                    <include>org.apache.amoro:amoro-format-mixed-flink-common</include>
-                                </includes>
-                            </artifactSet>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
-        </plugins>
-    </build>
-</project>
diff --git a/amoro-format-mixed/amoro-mixed-flink/v1.16/amoro-mixed-flink-runtime-1.16/pom.xml b/amoro-format-mixed/amoro-mixed-flink/v1.16/amoro-mixed-flink-runtime-1.16/pom.xml
deleted file mode 100644
index a841bbad2..000000000
--- a/amoro-format-mixed/amoro-mixed-flink/v1.16/amoro-mixed-flink-runtime-1.16/pom.xml
+++ /dev/null
@@ -1,229 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
- ~ Licensed to the Apache Software Foundation (ASF) under one
- ~ or more contributor license agreements. See the NOTICE file
- ~ distributed with this work for additional information
- ~ regarding copyright ownership. The ASF licenses this file
- ~ to you under the Apache License, Version 2.0 (the
- ~ "License"); you may not use this file except in compliance
- ~ with the License. You may obtain a copy of the License at
- ~
- ~ http://www.apache.org/licenses/LICENSE-2.0
- ~
- ~ Unless required by applicable law or agreed to in writing, software
- ~ distributed under the License is distributed on an "AS IS" BASIS,
- ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- ~ See the License for the specific language governing permissions and
- ~ limitations under the License.
- -->
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <modelVersion>4.0.0</modelVersion>
- <parent>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-mixed-flink</artifactId>
- <version>0.9-SNAPSHOT</version>
- <relativePath>../../pom.xml</relativePath>
- </parent>
-
- <artifactId>amoro-format-mixed-flink-runtime-1.16</artifactId>
- <name>Amoro Project Mixed Format Flink 1.16 Runtime</name>
- <url>https://amoro.apache.org</url>
-
- <properties>
- <iceberg.version>1.4.3</iceberg.version>
- <flink.version>1.16.3</flink.version>
- </properties>
-
- <dependencies>
- <dependency>
- <groupId>org.apache.amoro</groupId>
- <artifactId>amoro-format-mixed-flink-1.16</artifactId>
- <version>${project.parent.version}</version>
- </dependency>
-
- <dependency>
- <groupId>org.apache.flink</groupId>
- <artifactId>flink-connector-kafka</artifactId>
- <version>${flink.version}</version>
- <exclusions>
- <exclusion>
- <groupId>com.github.luben</groupId>
- <artifactId>zstd-jni</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
-
- <dependency>
- <groupId>org.apache.kafka</groupId>
- <artifactId>kafka-clients</artifactId>
- <version>${kafka.version}</version>
- <exclusions>
- <exclusion>
- <groupId>com.github.luben</groupId>
- <artifactId>zstd-jni</artifactId>
- </exclusion>
- </exclusions>
- </dependency>
- </dependencies>
-
-    <build>
-        <plugins>
-            <plugin>
-                <groupId>org.apache.maven.plugins</groupId>
-                <artifactId>maven-shade-plugin</artifactId>
-                <executions>
-                    <execution>
-                        <id>shade-amoro</id>
-                        <goals>
-                            <goal>shade</goal>
-                        </goals>
-                        <phase>package</phase>
-                        <configuration>
-                            <createDependencyReducedPom>false</createDependencyReducedPom>
-                            <artifactSet>
-                                <includes combine.children="append">
-                                    <include>org.apache.amoro:*</include>
-                                    <include>org.apache.iceberg:*</include>
-                                    <include>com.fasterxml.jackson.core:*</include>
-                                    <include>org.apache.parquet:*</include>
-                                    <include>org.apache.commons:*</include>
-                                    <include>commons-lang:*</include>
-                                    <include>com.github.ben-manes.caffeine:*</include>
-                                    <include>org.apache.avro:*</include>
-                                    <include>org.apache.orc:*</include>
-                                    <include>io.airlift:*</include>
-                                    <include>commons-collections:*</include>
-                                    <include>cglib:*</include>
-                                    <include>com.google.guava:*</include>
-                                    <include>asm:*</include>
-                                    <include>org.apache.httpcomponents.client5:*</include>
-                                    <include>org.apache.httpcomponents.core5:*</include>
-                                    <include>org.apache.flink:flink-connector-kafka</include>
-                                    <include>org.apache.kafka:*</include>
-                                    <include>com.github.luben:*</include>
-                                </includes>
-                            </artifactSet>
-
-                            <relocations>
-                                <relocation>
-                                    <pattern>org.apache.iceberg</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.iceberg</shadedPattern>
-                                    <excludes>
-                                        <exclude>org.apache.iceberg.mr.hive.*</exclude>
-                                    </excludes>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.parquet</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.parquet</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.commons</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.commons</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.avro</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.avro</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.orc</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.orc</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.hc</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.hc</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.jute</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.jute</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.apache.kafka</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.kafka</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>shaded.parquet</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.shaded.parquet</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.fasterxml</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.fasterxml</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.github.benmanes.caffeine</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.github.benmanes.caffeine</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.threeten.extra</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.threeten.extra</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>net.sf.cglib</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.net.sf.cglib</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.google</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.google</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>org.objectweb.asm</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.objectweb.asm</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>com.facebook.fb303</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.com.facebook.fb303</shadedPattern>
-                                </relocation>
-
-                                <relocation>
-                                    <pattern>io.airlift</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.io.airlift</shadedPattern>
-                                </relocation>
-
-                                <!-- flink-sql-connector-kafka -->
-                                <relocation>
-                                    <pattern>org.apache.flink.connector.kafka</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.connector.kafka</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.connectors.kafka</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.connectors.kafka</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.KeyedDeserializationSchema</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.KeyedSerializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.KeyedSerializationSchema</shadedPattern>
-                                </relocation>
-                                <relocation>
-                                    <pattern>org.apache.flink.streaming.util.serialization.TypeInformationKeyValueSerializationSchema</pattern>
-                                    <shadedPattern>org.apache.amoro.shade.org.apache.flink.streaming.util.serialization.TypeInformationKeyValueSerializationSchema</shadedPattern>
-                                </relocation>
-                            </relocations>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
-        </plugins>
-    </build>
-</project>
diff --git a/docs/_index.md b/docs/_index.md
index 2418178ed..063721585 100644
--- a/docs/_index.md
+++ b/docs/_index.md
@@ -69,7 +69,7 @@ Amoro support multiple processing engines for Mixed format as below:
| Processing Engine | Version | Batch Read | Batch Write | Batch Overwrite | Streaming Read | Streaming Write | Create Table | Alter Table |
|-------------------|------------------------|-------------|-------------|-----------------|----------------|-----------------|--------------|-------------|
-| Flink | 1.16.x, 1.17.x, 1.18.x | ✔ | ✔ | ✖ | ✔ | ✔ | ✔ | ✖ |
+| Flink | 1.17.x, 1.18.x | ✔ | ✔ | ✖ | ✔ | ✔ | ✔ | ✖ |
| Spark | 3.3, 3.4, 3.5 | ✔ | ✔ | ✔ | ✖ | ✖ | ✔ | ✔ |
| Hive | 2.x, 3.x | ✔ | ✖ | ✔ | ✖ | ✖ | ✖ | ✔ |
| Trino | 406 | ✔ | ✖ | ✔ | ✖ | ✖ | ✖ | ✔ |
diff --git a/docs/admin-guides/deployment.md b/docs/admin-guides/deployment.md
index 9f9c806a7..5a185c01d 100644
--- a/docs/admin-guides/deployment.md
+++ b/docs/admin-guides/deployment.md
@@ -53,13 +53,13 @@ $ cd dist/target/
$ ls
amoro-x.y.z-bin.zip # AMS release package
-$ cd ${base_dir}/amoro-format-mixed/amoro-format-mixed-flink/v1.15/amoro-format-mixed-flink-runtime-1.15/target
-$ ls
-amoro-format-mixed-flink-runtime-1.15-x.y.z.jar # Flink 1.15 runtime package
+$ cd ${base_dir}/amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/target
+$ ls
+amoro-format-mixed-flink-runtime-1.17-x.y.z.jar # Flink 1.17 runtime package
-$ cd ${base_dir}/amoro-format-mixed/amoro-format-mixed-spark/v3.2/amoro-format-mixed-spark-runtime-3.2/target
+$ cd ${base_dir}/amoro-format-mixed/amoro-mixed-spark/v3.3/amoro-mixed-spark-runtime-3.3/target
$ ls
-amoro-format-mixed-spark-runtime-3.2-x.y.z.jar # Spark v3.2 runtime package)
+amoro-format-mixed-spark-runtime-3.3-x.y.z.jar # Spark v3.3 runtime package
```
More build guide can be found in the project's [README](https://github.com/apache/amoro?tab=readme-ov-file#building).
diff --git a/docs/engines/flink/flink-dml.md b/docs/engines/flink/flink-dml.md
index e84425971..c226704a3 100644
--- a/docs/engines/flink/flink-dml.md
+++ b/docs/engines/flink/flink-dml.md
@@ -108,13 +108,13 @@ The following Hint Options are supported:
| scan.startup.specific-offsets | (none) | String | No
| specify offsets for each
partition in case of 'specific-offsets' startup mode, e.g.
'partition:0,offset:42;partition:1,offset:300'.
[...]
| properties.group.id | (none) | String | If the
LogStore for an Amoro table is Kafka, it is mandatory to provide its details
while querying the table. Otherwise, it can be left empty. | The group id used
to read the Kafka Topic
[...]
| properties.pulsar.admin.adminUrl | (none) | String | Required if
LogStore is pulsar, otherwise not required
| Pulsar admin's HTTP URL,
e.g. http://my-broker.example.com:8080
[...]
-| properties.* | (none) | String | No
| Parameters for Logstore:
<br>For Logstore with Kafka ('log-store.type'='kafka' default value), all other
parameters supported by the Kafka Consumer can be set by prefixing properties.
to the parameter name, for example, 'properties.batch.size'='16384'. The
complete parameter inform [...]
+| properties.* | (none) | String | No
| Parameters for Logstore:
<br>For Logstore with Kafka ('log-store.type'='kafka' default value), all other
parameters supported by the Kafka Consumer can be set by prefixing properties.
to the parameter name, for example, 'properties.batch.size'='16384'. The
complete parameter inform [...]
| log.consumer.changelog.modes | all-kinds | String | No
| The type of RowKind that
will be generated when reading log data, supports: all-kinds,
append-only.<br>all-kinds: will read cdc data, including
+I/-D/-U/+U;<br>append-only: will only generate Insert data, recommended to use
this configuration when reading without primary key. [...]
> **Notes**
>
> - When log-store.type = pulsar, the parallelism of the Flink task cannot be
> less than the number of partitions in the Pulsar topic, otherwise some
> partition data cannot be read.
-> - When the number of topic partitions in log-store is less than the parallelism of the Flink task, some Flink subtasks will be idle. At this time, if the task has a watermark, the parameter table.exec.source.idle-timeout must be configured, otherwise the watermark will not advance. See [official documentation](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/dev/table/config/#table-exec-source-idle-timeout) for details.
+> - When the number of topic partitions in log-store is less than the parallelism of the Flink task, some Flink subtasks will be idle. At this time, if the task has a watermark, the parameter table.exec.source.idle-timeout must be configured, otherwise the watermark will not advance. See [official documentation](https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/dev/table/config/#table-exec-source-idle-timeout) for details.
### Streaming mode (FileStore non-primary key table)
@@ -204,7 +204,7 @@ Hint Options
| write.distribution-mode | hash | String
| No
| The distribution modes for
writing to the Amoro table include: none and hash.
[...]
| write.distribution.hash-mode | auto | String
| No
| The hash strategy for
writing to an Amoro table only takes effect when write.distribution-mode=hash.
The available options are: primary-key, partition-key, primary-partition-key,
and auto. primary-key: Shuffle by primary key partition-key: Shuffle by
partition key primary-partitio [...]
| properties.pulsar.admin.adminUrl | (none) | String
| If the LogStore is Pulsar and it is required for querying, it must be filled
in, otherwise it can be left empty.<img width=100/> | The HTTP URL for Pulsar
Admin is in the format: http://my-broker.example.com:8080.
[...]
-| properties.* | (none) | String
| No
| Parameters for Logstore:
For Logstore with Kafka ('log-store.type'='kafka' default value), all other
parameters supported by the Kafka Consumer can be set by prefixing properties.
to the parameter name, for example, 'properties.batch.size'='16384'. The
complete parameter informati [...]
+| properties.* | (none) | String
| No
| Parameters for Logstore:
For Logstore with Kafka ('log-store.type'='kafka' default value), all other
parameters supported by the Kafka Consumer can be set by prefixing properties.
to the parameter name, for example, 'properties.batch.size'='16384'. The
complete parameter informati [...]
| other table parameters | (none) | String
| No
| All parameters of an Amoro
table can be dynamically modified through SQL Hints, but they only take effect
for this specific task. For the specific parameter list, please refer to the
[Table Configuration](../configurations/). For permissions-related
configurations on the catalog, [...]
## Lookup join with SQL
diff --git a/docs/engines/flink/flink-ds.md b/docs/engines/flink/flink-ds.md
index de3af0202..56e0c1dbd 100644
--- a/docs/engines/flink/flink-ds.md
+++ b/docs/engines/flink/flink-ds.md
@@ -33,7 +33,7 @@ To add a dependency on Mixed-format flink connector in Maven, add the following
...
<dependency>
<groupId>org.apache.amoro</groupId>
- <!-- For example: amoro-format-mixed-flink-runtime-1.15 -->
+ <!-- For example: amoro-format-mixed-flink-runtime-1.17 -->
<artifactId>amoro-format-mixed-flink-runtime-${flink.minor-version}</artifactId>
<!-- For example: 0.7.0-incubating -->
<version>${amoro-format-mixed-flink.version}</version>
diff --git a/docs/engines/flink/flink-get-started.md b/docs/engines/flink/flink-get-started.md
index 0d233e2d9..0c6e1d3cd 100644
--- a/docs/engines/flink/flink-get-started.md
+++ b/docs/engines/flink/flink-get-started.md
@@ -54,17 +54,17 @@ Flink Connector includes:
The Amoro project can be self-compiled to obtain the runtime jar.
-`./mvnw clean package -pl ':amoro-mixed-flink-runtime-1.15' -am -DskipTests`
+`./mvnw clean package -pl ':amoro-mixed-flink-runtime-1.17' -am -DskipTests`
-The Flink Runtime Jar is located in the `amoro-format-mixed/amoro-format-mixed-flink/v1.15/amoro-format-mixed-flink-runtime-1.15/target` directory.
+The Flink Runtime Jar is located in the `amoro-format-mixed/amoro-mixed-flink/v1.17/amoro-mixed-flink-runtime-1.17/target` directory.
## Environment preparation
-Download Flink and related dependencies, and download Flink 1.15/1.16/1.17 as needed. Taking Flink 1.15 as an example:
+Download Flink and related dependencies, and download Flink 1.17/1.18 as needed. Taking Flink 1.17 as an example:
```shell
# Replace version value with the latest Amoro version if needed
AMORO_VERSION=0.8.0-incubating
-FLINK_VERSION=1.15.3
-FLINK_MAJOR_VERSION=1.15
+FLINK_VERSION=1.17.2
+FLINK_MAJOR_VERSION=1.17
FLINK_HADOOP_SHADE_VERSION=2.7.5
APACHE_FLINK_URL=archive.apache.org/dist/flink
MAVEN_URL=https://repo1.maven.org/maven2
@@ -90,7 +90,7 @@ mv amoro-mixed-format-flink-runtime-${FLINK_MAJOR_VERSION}-${AMORO_VERSION}.jar
Modify Flink related configuration files:
```shell
-cd flink-1.15.3
+cd flink-1.17.2
vim conf/flink-conf.yaml
```
Modify the following settings:
@@ -128,6 +128,6 @@ You need to enable Flink checkpoint and modify the [Flink checkpoint configurati
The query results obtained through Flink SQL-Client cannot provide MOR semantics based on primary keys. If you need to obtain merged results through Flink engine queries, you can write the content of Amoro tables to a MySQL table through JDBC connector for viewing.
-**3. When writing to Amoro tables with write.upsert feature enabled through SQL-Client under Flink 1.15, there are still duplicate primary key data**
+**3. When writing to Amoro tables with write.upsert feature enabled through SQL-Client under Flink 1.17, there are still duplicate primary key data**
-You need to execute the command `set table.exec.sink.upsert-materialize = none` in SQL-Client to turn off the upsert materialize operator generated upsert view. This operator will affect the AmoroWriter's generation of delete data when the write.upsert feature is enabled, causing duplicate primary key data to not be merged.
\ No newline at end of file
+You need to execute the command `set table.exec.sink.upsert-materialize = none` in SQL-Client to turn off the upsert materialize operator generated upsert view. This operator will affect the AmoroWriter's generation of delete data when the write.upsert feature is enabled, causing duplicate primary key data to not be merged.
diff --git a/docs/engines/flink/using-logstore.md b/docs/engines/flink/using-logstore.md
index 44db2f1fb..30803dc93 100644
--- a/docs/engines/flink/using-logstore.md
+++ b/docs/engines/flink/using-logstore.md
@@ -42,17 +42,15 @@ Users can enable LogStore by configuring the following parameters when creating
| Flink | Kafka |
|------------|----------|
-| Flink 1.15 | ✔ |
-| Flink 1.16 | ✔ |
| Flink 1.17 | ✔ |
+| Flink 1.18 | ✔ |
Kafka as LogStore Version Description:
| Flink Version | Kafka Versions |
|---------------| ----------------- |
-| 1.15.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
-| 1.16.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
-| 1.17.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
+| 1.17.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
+| 1.18.x | 0.10.2.\*<br> 0.11.\*<br> 1.\*<br> 2.\*<br> 3.\* |
diff --git a/docs/user-guides/configurations.md b/docs/user-guides/configurations.md
index e25c52811..72af2605a 100644
--- a/docs/user-guides/configurations.md
+++ b/docs/user-guides/configurations.md
@@ -138,7 +138,7 @@ If using Iceberg Format,please refer to [Iceberg configurations](https://icebe
| log-store.address | NULL | Address of LogStore,
required when LogStore enabled. For Kafka, this is the Kafka bootstrap servers.
For Pulsar, this is the Pulsar Service URL, such as 'pulsar://localhost:6650'
[...]
| log-store.topic | NULL | Topic of LogStore,
required when LogStore enabled
[...]
| properties.pulsar.admin.adminUrl | NULL | HTTP URL of Pulsar
admin, such as 'http://my-broker.example.com:8080'. Only required when
log-store.type=pulsar
[...]
-| properties.XXX | NULL | Other configurations of
LogStore. <br><br>For Kafka, all the configurations supported by Kafka
Consumer/Producer can be set by prefixing them with `properties.`,<br>such as
`'properties.batch.size'='16384'`,<br>refer to [Kafka Consumer
Configurations](https://kafka.apache.org/documentation/#consumerconfigs),
[Kafka Producer
Configurations](https://kafka.apache.org/documentation/#producerconfigs) for
more details.<br><br> For Pulsar,al [...]
+| properties.XXX | NULL | Other configurations of
LogStore. <br><br>For Kafka, all the configurations supported by Kafka
Consumer/Producer can be set by prefixing them with `properties.`,<br>such as
`'properties.batch.size'='16384'`,<br>refer to [Kafka Consumer
Configurations](https://kafka.apache.org/documentation/#consumerconfigs),
[Kafka Producer
Configurations](https://kafka.apache.org/documentation/#producerconfigs) for
more details.<br><br> For Pulsar,al [...]
### Watermark configurations