(flink-web) branch asf-site updated: Rebuild website

2024-07-10 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 6ba59c924 Rebuild website
6ba59c924 is described below

commit 6ba59c92473f2bd2c151bdd8b2167e301be0ae67
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Jul 10 11:15:20 2024 +0200

Rebuild website
---
 .../code-style-and-quality-common/index.html   | 28 +++---
 .../code-style-and-quality-components/index.html   | 28 +++---
 .../code-style-and-quality-formatting/index.html   | 28 +++---
 .../code-style-and-quality-java/index.html | 28 +++---
 .../code-style-and-quality-preamble/index.html | 28 +++---
 .../index.html | 28 +++---
 .../code-style-and-quality-scala/index.html| 28 +++---
 .../code-style-and-quality-common/index.html   | 28 +++---
 .../code-style-and-quality-components/index.html   | 28 +++---
 .../code-style-and-quality-formatting/index.html   | 28 +++---
 .../code-style-and-quality-java/index.html | 28 +++---
 .../code-style-and-quality-preamble/index.html | 28 +++---
 .../index.html | 28 +++---
 .../code-style-and-quality-scala/index.html| 28 +++---
 14 files changed, 196 insertions(+), 196 deletions(-)

diff --git a/content/how-to-contribute/code-style-and-quality-common/index.html 
b/content/how-to-contribute/code-style-and-quality-common/index.html
index 65ef35b70..f553d9869 100644
--- a/content/how-to-contribute/code-style-and-quality-common/index.html
+++ b/content/how-to-contribute/code-style-and-quality-common/index.html
@@ -472,33 +472,33 @@ https://github.com/alex-shpak/hugo-book
   Code Style and Quality Guide — Common Rules
   #
 
[hunk body lost to the archive's HTML stripping; only heading text survived.
The hunk rewrites the "#" anchor links beside the sidebar headings: Preamble,
Pull Requests & Changes, Common Coding Guide, Java Language Guide, Scala
Language Guide, Components Guide, and Formatting Guide.]
 
 
diff --git 
a/content/how-to-contribute/code-style-and-quality-components/index.html 
b/content/how-to-contribute/code-style-and-quality-components/index.html
index a866267e2..684ac1a23 100644
--- a/content/how-to-contribute/code-style-and-quality-components/index.html
+++ b/content/how-to-contribute/code-style-and-quality-components/index.html
@@ -474,33 +474,33 @@ https://github.com/alex-shpak/hugo-book
   Code Style and Quality Guide — Components Guide
   #
 
[hunk body lost to the archive's HTML stripping; same anchor-link rewrite
across the sidebar headings as in the previous file.]
 
   Component Specific Guidelines
diff --git 
a/content/how-to-contribute/code-style-and-quality-formatting/index.html 
b/content/how-to-contribute/code-style-and-quality-formatting/index.html
index 1564bd742..289860eb5 100644
--- a/content/how-to-contribute/code-style-and-quality-formatting/index.html
+++ b/content/how-to-contribute/code-style-and-quality-formatting/index.html
@@ -472,33 +472,33 @@ https://github.com/alex-shpak/hugo-book
   Code Style and Quality Guide — Formatting Guide
   #
 
[hunk body lost to the archive's HTML stripping; same anchor-link rewrite
across the sidebar headings as in the previous file.]
 
   Java Code Formatting Style
diff --git a/content/how-to-contribute/code-style-and-quality-java/index.html 
b/content/how-to-contribute/code-style-and-quality-java/index.html
index 60c179cdd..f3f0cdc3e 100644
--- a/content/how-to-contribute/code-style-and-quality-java/index.html
+++ b/content/how-to-contribute/code-style-and-quality-java/index.html
@@ -470,33 +470,33 @@ https://github.com/alex-shpak/hugo-book
   Code Style and Quality Guide — Java
   #
 
[hunk body lost to the archive's HTML stripping; same anchor-link rewrite
across the sidebar headings as in the previous file.]
 
   Java Language Features and Libraries
diff --git 
a/content/how-to-contribute/code-style-and-quality-preamble/index.html 
b/content/how-to-contribute/code-style-and-quality-preamble/index.html
index d9f9f6b60..76a63395d 100644
--- a/content/how-to-contribute/code-style-and-quality-preamble/index.html
+++ b/content/how-to-contribute/code-style-and-q

(flink) branch master updated: [FLINK-34268] Removing exclusions for FLINK-33676 33805

2024-07-04 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new cc19159b489 [FLINK-34268] Removing exclusions for FLINK-33676 33805
cc19159b489 is described below

commit cc19159b489dafad71c01bf551bc224a38203eae
Author: Jim Hughes 
AuthorDate: Wed Jul 3 08:54:16 2024 -0400

[FLINK-34268] Removing exclusions for FLINK-33676 33805

When FLINK-34268 was initially implemented, FLINK-33676 and FLINK-33805 had
not landed.

Because of that, exclusions were hard-coded. This PR removes them.
---
 .../plan/nodes/exec/testutils/RestoreTestCompleteness.java   | 12 ------------
 1 file changed, 12 deletions(-)

diff --git 
a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/testutils/RestoreTestCompleteness.java
 
b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/testutils/RestoreTestCompleteness.java
index ca2d28b7166..092b37d7753 100644
--- 
a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/testutils/RestoreTestCompleteness.java
+++ 
b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/testutils/RestoreTestCompleteness.java
@@ -19,16 +19,12 @@
 package org.apache.flink.table.planner.plan.nodes.exec.testutils;
 
 import org.apache.flink.table.planner.plan.nodes.exec.ExecNode;
-import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecGlobalWindowAggregate;
-import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecLocalWindowAggregate;
-import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecOverAggregate;
 import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecPythonCalc;
 import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecPythonCorrelate;
 import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecPythonGroupAggregate;
 import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecPythonGroupTableAggregate;
 import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecPythonGroupWindowAggregate;
 import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecPythonOverAggregate;
-import org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecWindowAggregate;
 import org.apache.flink.table.planner.plan.utils.ExecNodeMetadataUtil;
 import org.apache.flink.table.planner.plan.utils.ExecNodeMetadataUtil.ExecNodeNameVersion;
 
@@ -52,14 +48,6 @@ public class RestoreTestCompleteness {
     private static final Set<Class<? extends ExecNode<?>>> SKIP_EXEC_NODES =
             new HashSet<Class<? extends ExecNode<?>>>() {
                 {
-                    /** TODO: Remove after FLINK-33676 is merged. */
-                    add(StreamExecWindowAggregate.class);
-                    add(StreamExecLocalWindowAggregate.class);
-                    add(StreamExecGlobalWindowAggregate.class);
-
-                    /** TODO: Remove after FLINK-33805 is merged. */
-                    add(StreamExecOverAggregate.class);
-
                     /** Ignoring python based exec nodes temporarily. */
                     add(StreamExecPythonCalc.class);
                     add(StreamExecPythonCorrelate.class);
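
For context, the check in RestoreTestCompleteness works by set difference: every
ExecNode class must either have a restore test or be listed in the skip set, so
deleting entries tightens coverage. A schematic sketch of that pattern (the
inputs here are hypothetical stand-ins, not the actual Flink test code):

import java.util.HashSet;
import java.util.Set;

// Sketch: fail for any exec node that is neither skipped nor covered by a restore test.
class CompletenessSketch {
    static final Set<String> SKIP = new HashSet<>(Set.of("StreamExecPythonCalc"));

    static void check(Set<String> allExecNodes, Set<String> nodesWithRestoreTests) {
        for (String node : allExecNodes) {
            if (!SKIP.contains(node) && !nodesWithRestoreTests.contains(node)) {
                throw new AssertionError(node + " is missing a restore test");
            }
        }
    }
}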



(flink) branch master updated (14fdd13b632 -> f0b01277dd2)

2024-06-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 14fdd13b632 [FLINK-25537] [JUnit5 Migration] Module: flink-core with,Package: core (#24881)
 add f0b01277dd2 [FLINK-35378][FLIP-453] Promote Unified Sink API V2 to Public and Deprecate SinkFunction. This closes #24805

No new revisions were added by this update.

Summary of changes:
 .../18509c9e-3250-4c52-91b9-11ccefc85db1 |  9 +
 .../e5126cae-f3fe-48aa-b6fb-60ae6cc3fcd5 | 16 
 .../org/apache/flink/api/connector/sink2/Committer.java  |  6 +++---
 .../flink/api/connector/sink2/CommitterInitContext.java  |  4 ++--
 .../flink/api/connector/sink2/CommittingSinkWriter.java  |  4 ++--
 .../java/org/apache/flink/api/connector/sink2/Sink.java  |  3 ++-
 .../org/apache/flink/api/connector/sink2/SinkWriter.java |  6 +++---
 .../flink/api/connector/sink2/StatefulSinkWriter.java|  4 ++--
 .../flink/api/connector/sink2/SupportsCommitter.java |  4 ++--
 .../flink/api/connector/sink2/SupportsWriterState.java   |  3 ++-
 .../flink/api/connector/sink2/WriterInitContext.java |  6 ++
 .../streaming/api/functions/sink/DiscardingSink.java |  3 +++
 .../streaming/api/functions/sink/PrintSinkFunction.java  |  3 +++
 .../streaming/api/functions/sink/RichSinkFunction.java   |  8 +++-
 .../flink/streaming/api/functions/sink/SinkFunction.java |  3 +++
 .../streaming/api/functions/sink/SocketClientSink.java   |  3 +++
 .../api/functions/sink/TwoPhaseCommitSinkFunction.java   |  3 +++
 pom.xml  |  5 +
 18 files changed, 64 insertions(+), 29 deletions(-)
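
For readers migrating off SinkFunction, a minimal Sink V2 sketch against the
newly @Public interfaces (assuming the post-FLIP-453
createWriter(WriterInitContext) signature; illustrative, not code from this
commit):

import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.api.connector.sink2.WriterInitContext;

import java.io.IOException;

// A stateless sink: the V2 counterpart of a trivial SinkFunction.
public class StdoutSink implements Sink<String> {
    @Override
    public SinkWriter<String> createWriter(WriterInitContext context) throws IOException {
        return new SinkWriter<String>() {
            @Override
            public void write(String element, Context context) {
                System.out.println(element); // plays the role of SinkFunction#invoke
            }

            @Override
            public void flush(boolean endOfInput) {}

            @Override
            public void close() {}
        };
    }
}

Such a sink is attached with stream.sinkTo(new StdoutSink()) rather than the
now-deprecated stream.addSink(...).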



(flink-docker) branch master updated: [FLINK-34746] Switching to the Apache CDN for Dockerfile

2024-05-24 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git


The following commit(s) were added to refs/heads/master by this push:
 new 8836007  [FLINK-34746] Switching to the Apache CDN for Dockerfile
8836007 is described below

commit 883600747505c128d97e9d25c9326f0c6f1d31e4
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Thu Mar 21 09:58:26 2024 +

[FLINK-34746] Switching to the Apache CDN for Dockerfile
---
 1.17/scala_2.12-java11-ubuntu/Dockerfile | 2 +-
 1.17/scala_2.12-java8-ubuntu/Dockerfile  | 2 +-
 1.18/scala_2.12-java11-ubuntu/Dockerfile | 2 +-
 1.18/scala_2.12-java17-ubuntu/Dockerfile | 2 +-
 1.18/scala_2.12-java8-ubuntu/Dockerfile  | 2 +-
 1.19/scala_2.12-java11-ubuntu/Dockerfile | 2 +-
 1.19/scala_2.12-java17-ubuntu/Dockerfile | 2 +-
 1.19/scala_2.12-java8-ubuntu/Dockerfile  | 2 +-
 8 files changed, 8 insertions(+), 8 deletions(-)

diff --git a/1.17/scala_2.12-java11-ubuntu/Dockerfile 
b/1.17/scala_2.12-java11-ubuntu/Dockerfile
index b4ac0b2..602555b 100644
--- a/1.17/scala_2.12-java11-ubuntu/Dockerfile
+++ b/1.17/scala_2.12-java11-ubuntu/Dockerfile
@@ -44,7 +44,7 @@ RUN set -ex; \
   gosu nobody true
 
 # Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz \
+ENV FLINK_TGZ_URL=https://dlcdn.apache.org/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz \
     FLINK_ASC_URL=https://downloads.apache.org/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz.asc \
     GPG_KEY=2E0E1AB5D39D55E608071FB9F795C02A4D2482B3 \
     CHECK_GPG=true
diff --git a/1.17/scala_2.12-java8-ubuntu/Dockerfile 
b/1.17/scala_2.12-java8-ubuntu/Dockerfile
index f3f3c18..60dc4d3 100644
--- a/1.17/scala_2.12-java8-ubuntu/Dockerfile
+++ b/1.17/scala_2.12-java8-ubuntu/Dockerfile
@@ -44,7 +44,7 @@ RUN set -ex; \
   gosu nobody true
 
 # Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz \
+ENV FLINK_TGZ_URL=https://dlcdn.apache.org/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz \
     FLINK_ASC_URL=https://downloads.apache.org/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz.asc \
     GPG_KEY=2E0E1AB5D39D55E608071FB9F795C02A4D2482B3 \
     CHECK_GPG=true
diff --git a/1.18/scala_2.12-java11-ubuntu/Dockerfile 
b/1.18/scala_2.12-java11-ubuntu/Dockerfile
index 2dcbd27..be6e28d 100644
--- a/1.18/scala_2.12-java11-ubuntu/Dockerfile
+++ b/1.18/scala_2.12-java11-ubuntu/Dockerfile
@@ -44,7 +44,7 @@ RUN set -ex; \
   gosu nobody true
 
 # Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz \
+ENV FLINK_TGZ_URL=https://dlcdn.apache.org/flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz \
     FLINK_ASC_URL=https://downloads.apache.org/flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz.asc \
     GPG_KEY=96AE0E32CBE6E0753CE6DF6CB078D1D3253A8D82 \
     CHECK_GPG=true
diff --git a/1.18/scala_2.12-java17-ubuntu/Dockerfile 
b/1.18/scala_2.12-java17-ubuntu/Dockerfile
index 27f23fb..1a5fd05 100644
--- a/1.18/scala_2.12-java17-ubuntu/Dockerfile
+++ b/1.18/scala_2.12-java17-ubuntu/Dockerfile
@@ -44,7 +44,7 @@ RUN set -ex; \
   gosu nobody true
 
 # Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz \
+ENV FLINK_TGZ_URL=https://dlcdn.apache.org/flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz \
     FLINK_ASC_URL=https://downloads.apache.org/flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz.asc \
     GPG_KEY=96AE0E32CBE6E0753CE6DF6CB078D1D3253A8D82 \
     CHECK_GPG=true
diff --git a/1.18/scala_2.12-java8-ubuntu/Dockerfile 
b/1.18/scala_2.12-java8-ubuntu/Dockerfile
index ed727e9..3b26678 100644
--- a/1.18/scala_2.12-java8-ubuntu/Dockerfile
+++ b/1.18/scala_2.12-java8-ubuntu/Dockerfile
@@ -44,7 +44,7 @@ RUN set -ex; \
   gosu nobody true
 
 # Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz \
+ENV FLINK_TGZ_URL=https://dlcdn.apache.org/flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz \
     FLINK_ASC_URL=https://downloads.apache.org/flink/flink-1.18.1/flink-1.18.1-bin-scala_2.12.tgz.asc \
     GPG_KEY=96AE0E32CBE6E0753CE6DF6CB078D1D3253A8D82 \
     CHECK_GPG=true
diff --git a/1.19/scala_2.12-java11-ubuntu/Dockerfile 
b/1.19/scala_2.12-java11-ubuntu/Dockerfile
index b8e946d..fdcf9a3 100644
--- a/1.19/scala_2.12-java11-ubuntu/Dockerfile
+++ b/1.19/scala_2.12-java11-ubuntu/Dockerfile
@@ -44,7 +44,7 @@ RUN set -ex; \
   gosu nobody true
 
 # Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.19.0

(flink-connector-kudu) branch main updated: [hotfix] Remove non-existing branches for Weekly tests

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git


The following commit(s) were added to refs/heads/main by this push:
 new 2b82430  [hotfix] Remove non-existing branches for Weekly tests
2b82430 is described below

commit 2b82430280d347193d8ba26a844a3c4344365964
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue May 14 09:09:38 2024 +0200

[hotfix] Remove non-existing branches for Weekly tests
---
 .github/workflows/weekly.yml | 18 +++---
 1 file changed, 3 insertions(+), 15 deletions(-)

diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index f3210c0..7b87f4d 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -30,9 +30,6 @@ jobs:
 strategy:
   matrix:
 flink_branches: [{
-  flink: 1.17-SNAPSHOT,
-  branch: main
-}, {
   flink: 1.18-SNAPSHOT,
   jdk: '8, 11, 17',
   branch: main
@@ -41,18 +38,9 @@ jobs:
   jdk: '8, 11, 17, 21',
   branch: main
 }, {
-  flink: 1.17.2,
-  branch: v3.1
-}, {
-  flink: 1.18.1,
-  jdk: '8, 11, 17',
-  branch: v3.1
-}, {
-  flink: 1.17.2,
-  branch: v3.0
-}, {
-  flink: 1.18.1,
-  branch: v3.0
+  flink: 1.20-SNAPSHOT,
+  jdk: '8, 11, 17, 21',
+  branch: main
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:



(flink-connector-kudu) branch main updated: [FLINK-34961] Use dedicated CI name for JDBC connector to differentiate it in infra-reports

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git


The following commit(s) were added to refs/heads/main by this push:
 new 4098566  [FLINK-34961] Use dedicated CI name for JDBC connector to 
differentiate it in infra-reports
4098566 is described below

commit 409856698df1bf630c58f6e761d45e4dc8f06ad5
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue May 14 09:04:00 2024 +0200

[FLINK-34961] Use dedicated CI name for JDBC connector to differentiate it 
in infra-reports
---
 .github/workflows/push_pr.yml | 5 -
 .github/workflows/weekly.yml  | 5 -
 2 files changed, 8 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index 72b98a2..da2f077 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -16,7 +16,10 @@
 # limitations under the License.
 

 
-name: CI
+# We need to specify repo related information here since Apache INFRA doesn't differentiate
+# between several workflows with the same names while preparing a report for GHA usage
+# https://infra-reports.apache.org/#ghactions
+name: Flink Connector Kudu CI
 on: [push, pull_request]
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index aaa729f..f3210c0 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -16,7 +16,10 @@
 # limitations under the License.
 

 
-name: Nightly
+# We need to specify repo related information here since Apache INFRA doesn't differentiate
+# between several workflows with the same names while preparing a report for GHA usage
+# https://infra-reports.apache.org/#ghactions
+name: Weekly Flink Connector Kudu
 on:
   schedule:
 - cron: "0 0 * * 0"



(flink-connector-kudu) 40/44: [FLINK-34930] Fix KuduTableSourceITCase

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit abc4a3b965e54406eaddcf06261116916440c4c5
Author: Ferenc Csaky 
AuthorDate: Thu Mar 28 15:05:46 2024 +0100

[FLINK-34930] Fix KuduTableSourceITCase
---
 .../connector/kudu/table/KuduTableSourceITCase.java  | 20 
 1 file changed, 16 insertions(+), 4 deletions(-)

diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connector/kudu/table/KuduTableSourceITCase.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connector/kudu/table/KuduTableSourceITCase.java
index 2468e77..7971bda 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connector/kudu/table/KuduTableSourceITCase.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connector/kudu/table/KuduTableSourceITCase.java
@@ -23,6 +23,7 @@ import org.apache.flink.table.api.TableEnvironment;
 import org.apache.flink.types.Row;
 import org.apache.flink.util.CloseableIterator;
 
+import org.junit.jupiter.api.AfterEach;
 import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.Test;
 
@@ -36,9 +37,11 @@ public class KuduTableSourceITCase extends KuduTestBase {
 private TableEnvironment tableEnv;
 private KuduCatalog catalog;
 
+private KuduTableInfo tableInfo = null;
+
 @BeforeEach
-public void init() {
-KuduTableInfo tableInfo = booksTableInfo("books", true);
+void init() {
+tableInfo = booksTableInfo("books", true);
 setUpDatabase(tableInfo);
 tableEnv = KuduTableTestUtils.createTableEnvWithBlinkPlannerBatchMode();
 catalog = new KuduCatalog(getMasterAddress());
@@ -46,6 +49,14 @@ public class KuduTableSourceITCase extends KuduTestBase {
 tableEnv.useCatalog("kudu");
 }
 
+@AfterEach
+void cleanup() {
+if (tableInfo != null) {
+cleanDatabase(tableInfo);
+tableInfo = null;
+}
+}
+
 @Test
 void testFullBatchScan() throws Exception {
 CloseableIterator<Row> it =
@@ -53,7 +64,8 @@ public class KuduTableSourceITCase extends KuduTestBase {
 List<Row> results = new ArrayList<>();
 it.forEachRemaining(results::add);
 assertEquals(5, results.size());
-assertEquals("1001,Java for dummies,Tan Ah Teck,11.11,11", results.get(0).toString());
+assertEquals(
+        "+I[1001, Java for dummies, Tan Ah Teck, 11.11, 11]", results.get(0).toString());
 tableEnv.executeSql("DROP TABLE books");
 }
 
@@ -68,7 +80,7 @@ public class KuduTableSourceITCase extends KuduTestBase {
 List<Row> results = new ArrayList<>();
 it.forEachRemaining(results::add);
 assertEquals(1, results.size());
-assertEquals("More Java for more dummies", results.get(0).toString());
+assertEquals("+I[More Java for more dummies]", results.get(0).toString());
 tableEnv.executeSql("DROP TABLE books");
 }
 }
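
The expected literals change because Row.toString() in current Flink prints the
row's change kind (+I for insert, -D for delete, and so on) and brackets the
fields. A standalone demonstration (not part of the commit):

import org.apache.flink.types.Row;
import org.apache.flink.types.RowKind;

// Row.toString() renders the change kind plus bracketed, comma-separated fields.
public class RowKindDemo {
    public static void main(String[] args) {
        Row row = Row.ofKind(RowKind.INSERT, 1001, "Java for dummies", "Tan Ah Teck", 11.11, 11);
        System.out.println(row); // prints: +I[1001, Java for dummies, Tan Ah Teck, 11.11, 11]
    }
}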



(flink-connector-kudu) 25/44: BAHIR-296: Unify mockito version to 1.10.19

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 1dce82988ac057da3ca3ee5dcb67df227cd86284
Author: Joao Boto 
AuthorDate: Mon Dec 27 16:36:32 2021 +0100

BAHIR-296: Unify mockito version to 1.10.19
---
 flink-connector-kudu/pom.xml | 5 +
 1 file changed, 1 insertion(+), 4 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 12fd9da..750929c 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -31,7 +31,6 @@
 
   <properties>
     <kudu.version>1.13.0</kudu.version>
-    <mockito.version>1.10.19</mockito.version>
   </properties>
 
   <dependencies>
@@ -97,8 +96,6 @@
     <dependency>
       <groupId>org.mockito</groupId>
       <artifactId>mockito-all</artifactId>
-      <version>${mockito.version}</version>
-      <scope>test</scope>
     </dependency>
 
@@ -119,7 +116,7 @@
       <version>${log4j.version}</version>
       <scope>test</scope>
     </dependency>
[remaining hunk body unrecoverable: XML markup stripped by the archive]



(flink-connector-kudu) 41/44: [FLINK-34930] State Bahir fork

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 9253b9d7b59abd34c7f0c87e6ad5673984ed03c9
Author: Ferenc Csaky 
AuthorDate: Tue Apr 2 16:57:16 2024 +0200

[FLINK-34930] State Bahir fork
---
 README.md | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 1944eca..477aa84 100644
--- a/README.md
+++ b/README.md
@@ -2,6 +2,10 @@
 
 This repository contains the official Apache Flink Kudu connector.
 
+## Forked from Bahir
+
+The connector code is forked from [Apache Bahir](https://bahir.apache.org/) 
project after its retirement. 
+
 ## Apache Flink
 
 Apache Flink is an open source stream processing framework with powerful 
stream- and batch-processing capabilities.
@@ -65,4 +69,4 @@ This article describes [how to contribute to Apache 
Flink](https://flink.apache.
 ## About
 
 Apache Flink is an open source project of The Apache Software Foundation (ASF).
-The Apache Flink project originated from the 
[Stratosphere](http://stratosphere.eu) research project.
\ No newline at end of file
+The Apache Flink project originated from the 
[Stratosphere](http://stratosphere.eu) research project.



(flink-connector-kudu) 42/44: [FLINK-34930] Skip spotless for JDK 21+

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit c7dfb67801ced03fe028e5f738fc2ced670440b4
Author: Ferenc Csaky 
AuthorDate: Mon Apr 8 13:40:05 2024 +0200

[FLINK-34930] Skip spotless for JDK 21+
---
 pom.xml | 25 +
 1 file changed, 25 insertions(+)

diff --git a/pom.xml b/pom.xml
index 6d8617a..69e2d74 100644
--- a/pom.xml
+++ b/pom.xml
@@ -321,4 +321,29 @@ under the License.
 
+	<profiles>
+		<profile>
+			<id>java21</id>
+			<activation>
+				<jdk>[21,)</jdk>
+			</activation>
+			<build>
+				<plugins>
+					<plugin>
+						<groupId>com.diffplug.spotless</groupId>
+						<artifactId>spotless-maven-plugin</artifactId>
+						<configuration>
+							<skip>true</skip>
+						</configuration>
+					</plugin>
+				</plugins>
+			</build>
+		</profile>
+	</profiles>
 </project>



(flink-connector-kudu) 17/44: Add batch table env support and filter push down to Kudu connector (#82)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 6e8b66f16b25802f5111f7b70a20e7d7a1ca5d65
Author: Sebastian Liu 
AuthorDate: Tue Jul 14 11:25:44 2020 +0800

Add batch table env support and filter push down to Kudu connector (#82)

Update the KuduTableSource to inherit from InputFormatTableSource
in order to support both streaming SQL and Batch SQL at the same time.
In order to reduce unnecessary data transmission, the filter push down
was also added to the KuduTableSource.
---
 flink-connector-kudu/README.md |   2 +-
 flink-connector-kudu/pom.xml   |   2 +-
 .../connectors/kudu/connector/KuduFilterInfo.java  |  14 +-
 .../flink/connectors/kudu/table/KuduCatalog.java   |   5 +-
 .../connectors/kudu/table/KuduCatalogFactory.java  |   4 +-
 .../connectors/kudu/table/KuduTableFactory.java|   2 +-
 .../connectors/kudu/table/KuduTableSource.java |  96 +--
 .../kudu/table/utils/KuduTableUtils.java   | 142 
 .../connectors/kudu/table/KuduCatalogTest.java |   2 +-
 .../kudu/table/KuduTableFactoryTest.java   |   2 +-
 .../kudu/table/KuduTableSourceITCase.java  |  67 
 .../connectors/kudu/table/KuduTableSourceTest.java | 181 +
 .../connectors/kudu/table/KuduTableTestUtils.java  |  10 +-
 13 files changed, 496 insertions(+), 33 deletions(-)

diff --git a/flink-connector-kudu/README.md b/flink-connector-kudu/README.md
index 14c13eb..6370aa6 100644
--- a/flink-connector-kudu/README.md
+++ b/flink-connector-kudu/README.md
@@ -184,7 +184,7 @@ are described as being nullable, and not being primary keys.
 
 ## DataStream API
 
-It is also possible to use the the Kudu connector directly from the DataStream API however we
+It is also possible to use the Kudu connector directly from the DataStream API however we
 encourage all users to explore the Table API as it provides a lot of useful tooling when working
 with Kudu data.
 
diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index fe7887c..3bbefee 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -52,7 +52,7 @@
 
     <dependency>
       <groupId>org.apache.flink</groupId>
-      <artifactId>flink-table-planner-blink_2.11</artifactId>
+      <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
 
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
index 0a89cad..08fa86b 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
@@ -68,25 +68,25 @@ public class KuduFilterInfo implements Serializable {
 predicate = KuduPredicate.newComparisonPredicate(column, comparison, (String) this.value);
 break;
 case FLOAT:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (float) this.value);
 break;
 case INT8:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (byte) this.value);
 break;
 case INT16:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (short) this.value);
 break;
 case INT32:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (int) this.value);
 break;
 case INT64:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (long) this.value);
 break;
 case DOUBLE:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (double) this.value);
 break;
 case BOOL:
-predicate = KuduPredicate.newComparisonPredicate(column, comparison, this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, comparison, (boolean) this.value);
 break;
 case UNIXTIME_MICROS
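
The casts added above matter because the Kudu client overloads
KuduPredicate.newComparisonPredicate per primitive type, so a value held as
Object must be cast to select the intended overload. A standalone illustration
(the column name and value are made up):

import org.apache.kudu.ColumnSchema;
import org.apache.kudu.Type;
import org.apache.kudu.client.KuduPredicate;
import org.apache.kudu.client.KuduPredicate.ComparisonOp;

// Casting Object -> double selects the double overload of newComparisonPredicate.
public class PredicateDemo {
    public static void main(String[] args) {
        ColumnSchema col = new ColumnSchema.ColumnSchemaBuilder("price", Type.DOUBLE).build();
        Object value = 11.11d;
        KuduPredicate p =
                KuduPredicate.newComparisonPredicate(col, ComparisonOp.GREATER, (double) value);
        System.out.println(p);
    }
}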

(flink-connector-kudu) 43/44: [FLINK-34930] Enable module opens for tests for newer JDKs

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 5230ec106551307d83d3998fbe0d08d35fc7d2f8
Author: Ferenc Csaky 
AuthorDate: Thu Apr 11 14:42:02 2024 +0200

[FLINK-34930] Enable module opens for tests for newer JDKs
---
 pom.xml | 4 
 1 file changed, 4 insertions(+)

diff --git a/pom.xml b/pom.xml
index 69e2d74..d1548f7 100644
--- a/pom.xml
+++ b/pom.xml
@@ -66,6 +66,10 @@ under the License.
 


	<artifactId>flink-connector-kudu-parent</artifactId>
+	<surefire.module.config>
+		--add-opens=java.base/java.lang=ALL-UNNAMED
+		--add-opens=java.base/java.util=ALL-UNNAMED
+	</surefire.module.config>

 




(flink-connector-kudu) 44/44: [FLINK-34930] Migrate Bahir NOTICE

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 8674446dc0f1bab67528347ebfefc206bd380b6e
Author: Ferenc Csaky 
AuthorDate: Wed Apr 17 11:26:04 2024 +0200

[FLINK-34930] Migrate Bahir NOTICE
---
 NOTICE | 6 ++
 1 file changed, 2 insertions(+), 4 deletions(-)

diff --git a/NOTICE b/NOTICE
index c1e8320..c3ce37e 100644
--- a/NOTICE
+++ b/NOTICE
@@ -1,5 +1,5 @@
-Apache Flink Kudu Connector
-Copyright 2014-2024 The Apache Software Foundation
+Apache Bahir
+Copyright (c) 2016-2024 The Apache Software Foundation.
 
 This product includes software developed at
 The Apache Software Foundation (http://www.apache.org/).
@@ -12,5 +12,3 @@ ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO 
EVENT SHALL THE AUT
 DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING 
FROM LOSS OF USE, DATA OR PROFITS,
 WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING 
OUT OF OR IN CONNECTION WITH THE
 USE OR PERFORMANCE OF THIS SOFTWARE.
-
-



(flink-connector-kudu) 34/44: [BAHIR-308] Bump flink version to 1.15.3

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 6ff53e7c4d40f3986983439f35083407693026b5
Author: Joao Boto 
AuthorDate: Wed Mar 8 17:11:35 2023 +0100

[BAHIR-308] Bump flink version to 1.15.3
---
 flink-connector-kudu/pom.xml   | 16 ++---
 .../kudu/connector/ColumnSchemasFactory.java   |  1 -
 .../kudu/connector/CreateTableOptionsFactory.java  |  1 -
 .../connectors/kudu/connector/KuduFilterInfo.java  |  1 -
 .../connectors/kudu/connector/KuduTableInfo.java   |  3 +-
 .../convertor/RowResultRowDataConvertor.java   |  6 +---
 .../kudu/connector/failure/KuduFailureHandler.java |  1 -
 .../kudu/connector/reader/KuduReader.java  | 10 ++
 .../kudu/connector/reader/KuduReaderConfig.java|  3 +-
 .../writer/AbstractSingleOperationMapper.java  |  1 -
 .../kudu/connector/writer/KuduOperationMapper.java |  1 -
 .../kudu/connector/writer/KuduWriter.java  |  9 +
 .../kudu/connector/writer/KuduWriterConfig.java|  3 +-
 .../kudu/connector/writer/PojoOperationMapper.java |  6 +---
 .../connectors/kudu/format/KuduOutputFormat.java   |  1 -
 .../flink/connectors/kudu/streaming/KuduSink.java  |  1 -
 .../kudu/table/AbstractReadOnlyCatalog.java| 22 ++--
 .../flink/connectors/kudu/table/KuduCatalog.java   | 35 ---
 .../connectors/kudu/table/KuduTableFactory.java| 28 +++
 .../connectors/kudu/table/KuduTableSource.java | 13 ++-
 .../kudu/table/UpsertOperationMapper.java  |  1 -
 .../kudu/table/dynamic/KuduDynamicTableSource.java | 40 +++---
 .../table/dynamic/catalog/KuduDynamicCatalog.java  | 32 +++--
 .../kudu/table/utils/KuduTableUtils.java   | 12 ++-
 .../connectors/kudu/table/utils/KuduTypeUtils.java | 14 +---
 .../connectors/kudu/connector/KuduTestBase.java| 19 --
 .../kudu/format/KuduOutputFormatTest.java  |  3 +-
 .../connectors/kudu/streaming/KuduSinkTest.java|  1 -
 .../connectors/kudu/table/KuduCatalogTest.java |  1 -
 .../kudu/table/KuduTableFactoryTest.java   | 11 ++
 .../kudu/table/KuduTableSourceITCase.java  |  4 +--
 .../connectors/kudu/table/KuduTableSourceTest.java | 18 ++
 .../connectors/kudu/table/KuduTableTestUtils.java  |  4 +--
 .../kudu/writer/AbstractOperationTest.java |  8 +
 .../kudu/writer/PojoOperationMapperTest.java   |  3 +-
 .../kudu/writer/RowOperationMapperTest.java|  1 -
 .../kudu/writer/TupleOpertaionMapperTest.java  |  1 -
 37 files changed, 78 insertions(+), 257 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 20b16b4..134d6f7 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -36,13 +36,6 @@
 
   <dependencies>
 
-    <dependency>
-      <groupId>org.apache.kudu</groupId>
-      <artifactId>kudu-binary</artifactId>
-      <version>${kudu.version}</version>
-      <classifier>${os.detected.classifier}</classifier>
-      <scope>test</scope>
-    </dependency>
     <dependency>
       <groupId>org.apache.kudu</groupId>
       <artifactId>kudu-client</artifactId>
@@ -85,11 +78,11 @@
     <dependency>
       <groupId>org.apache.flink</groupId>
-      <artifactId>flink-clients_${scala.binary.version}</artifactId>
+      <artifactId>flink-clients</artifactId>
     </dependency>
     <dependency>
       <groupId>org.apache.flink</groupId>
-      <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
+      <artifactId>flink-streaming-java</artifactId>
     </dependency>
     <dependency>
       <groupId>org.apache.flink</groupId>
@@ -99,11 +92,6 @@
       <groupId>org.apache.flink</groupId>
       <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
     </dependency>
-    <dependency>
-      <groupId>org.apache.kudu</groupId>
-      <artifactId>kudu-binary</artifactId>
-      <classifier>${os.detected.classifier}</classifier>
-    </dependency>
     <dependency>
       <groupId>org.apache.kudu</groupId>
       <artifactId>kudu-client</artifactId>
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
index b178308..4997938 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
@@ -18,7 +18,6 @@
 package org.apache.flink.connectors.kudu.connector;
 
 import org.apache.flink.annotation.PublicEvolving;
-
 import org.apache.kudu.ColumnSchema;
 
 import java.io.Serializable;
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/CreateTableOptionsFactory.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/CreateTableOptionsFactory.java
index 4a475e9..fd9bfa4 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/CreateTableOptionsFactory.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/CreateTableOptionsFactory.java
@@ -18,7 +18,6 @@
 package org.apache.flink.connectors.kudu.connector;
 
 import org.apache.flink.annotation.PublicEvolving;
-
 import org.apache.kudu.client.CreateTableOptions

(flink-connector-kudu) 18/44: BAHIR-240: replace docker test by testcontainer (#89)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 3f9db639b956585ba4e2a7bc99557646d1fd53ff
Author: Joao Boto 
AuthorDate: Tue Jul 28 17:41:33 2020 +0200

BAHIR-240: replace docker test by testcontainer (#89)
---
 flink-connector-kudu/pom.xml | 1 -
 1 file changed, 1 deletion(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 3bbefee..bbf168e 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -32,7 +32,6 @@
   
 1.11.1
 1.10.19
-!DockerTest
   
 
   



(flink-connector-kudu) 06/44: [BAHIR-180] Improve eventual consistence for Kudu connector

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 247dfe99486f04622b4ee7d83fa51ef253aaa75d
Author: eskabetxe 
AuthorDate: Tue Jan 15 13:05:29 2019 +0100

[BAHIR-180] Improve eventual consistence for Kudu connector

Closes #40
---
 flink-connector-kudu/pom.xml   |   2 +-
 .../streaming/connectors/kudu/KuduInputFormat.java |  27 ++---
 .../connectors/kudu/KuduOutputFormat.java  |  36 +++---
 .../flink/streaming/connectors/kudu/KuduSink.java  |  34 +++---
 .../connectors/kudu/connector/KuduColumnInfo.java  |  50 
 .../connectors/kudu/connector/KuduConnector.java   |  81 -
 .../connectors/kudu/connector/KuduMapper.java  |  59 +
 .../connectors/kudu/connector/KuduRow.java |  72 ++-
 .../connectors/kudu/connector/KuduRowIterator.java |  57 +
 .../connectors/kudu/serde/DefaultSerDe.java|  39 ++
 .../connectors/kudu/serde/KuduDeserialization.java |  25 
 .../connectors/kudu/serde/KuduSerialization.java   |  28 +
 .../streaming/connectors/kudu/serde/PojoSerDe.java | 134 +
 .../connectors/kudu/KuduOuputFormatTest.java   |  11 +-
 .../streaming/connectors/kudu/KuduSinkTest.java|  11 +-
 .../connectors/kudu/connector/KuduDatabase.java|   2 +-
 .../connectors/kudu/serde/PojoSerDeTest.java   |  57 +
 17 files changed, 543 insertions(+), 182 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 5540fc1..61ab4a6 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -30,7 +30,7 @@
   jar
 
   
-1.8.0
+1.7.1
 
 !DockerTest
   
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormat.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormat.java
index 617e317..fd126d0 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormat.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormat.java
@@ -24,7 +24,9 @@ import org.apache.flink.core.io.InputSplitAssigner;
 import org.apache.flink.core.io.LocatableInputSplit;
 import org.apache.flink.streaming.connectors.kudu.connector.*;
 import org.apache.flink.util.Preconditions;
-import org.apache.kudu.client.*;
+import org.apache.kudu.client.KuduException;
+import org.apache.kudu.client.KuduScanToken;
+import org.apache.kudu.client.LocatedTablet;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -43,8 +45,7 @@ public class KuduInputFormat extends RichInputFormat implements 
OutputFormat {
+public class KuduOutputFormat extends RichOutputFormat {
 
 private static final Logger LOG = 
LoggerFactory.getLogger(KuduOutputFormat.class);
 
@@ -36,10 +37,12 @@ public class KuduOutputFormat 
implements OutputFormat
 private KuduConnector.Consistency consistency;
 private KuduConnector.WriteMode writeMode;
 
-private transient KuduConnector tableContext;
+private KuduSerialization serializer;
 
+private transient KuduConnector connector;
 
-public KuduOutputFormat(String kuduMasters, KuduTableInfo tableInfo) {
+
+public KuduOutputFormat(String kuduMasters, KuduTableInfo tableInfo, 
KuduSerialization serializer) {
 Preconditions.checkNotNull(kuduMasters,"kuduMasters could not be 
null");
 this.kuduMasters = kuduMasters;
 
@@ -47,8 +50,10 @@ public class KuduOutputFormat 
implements OutputFormat
 this.tableInfo = tableInfo;
 this.consistency = KuduConnector.Consistency.STRONG;
 this.writeMode = KuduConnector.WriteMode.UPSERT;
+this.serializer = serializer.withSchema(tableInfo.getSchema());
 }
 
+
 public KuduOutputFormat withEventualConsistency() {
 this.consistency = KuduConnector.Consistency.EVENTUAL;
 return this;
@@ -81,28 +86,31 @@ public class KuduOutputFormat 
implements OutputFormat
 
 @Override
 public void open(int taskNumber, int numTasks) throws IOException {
-startTableContext();
-}
-
-private void startTableContext() throws IOException {
-if (tableContext != null) return;
-tableContext = new KuduConnector(kuduMasters, tableInfo);
+if (connector != null) return;
+connector = new KuduConnector(kuduMasters, tableInfo, consistency, 
writeMode);
+serializer = serializer.withSchema(tableInfo.getSchema());
 }
 
 @Override
-public void writeRecord(OUT kuduRow) throws IOException {
+public void writeRecord(OUT row) throws IOException {
+boolean response;
 try {
-tableContext.writeRow(kuduRow, consistency, writeMode);
+KuduRow kuduRow = serializer.serialize(row);
+  
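
The hunk above lost its generic type parameters to the archive's HTML escaping.
For orientation, a usage sketch of the consistency toggle this commit
introduces, modeled on the test code quoted in a later message (the
harness-provided arguments are assumed):

import org.apache.flink.streaming.connectors.kudu.KuduOutputFormat;
import org.apache.flink.streaming.connectors.kudu.connector.KuduRow;
import org.apache.flink.streaming.connectors.kudu.connector.KuduTableInfo;
import org.apache.flink.streaming.connectors.kudu.serde.DefaultSerDe;

import java.io.IOException;

class EventualConsistencyExample {
    // masterAddresses and tableInfo come from the Kudu test harness in the real tests.
    static KuduOutputFormat<KuduRow> openFormat(String masterAddresses, KuduTableInfo tableInfo)
            throws IOException {
        KuduOutputFormat<KuduRow> outputFormat =
                new KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe())
                        .withEventualConsistency(); // the new toggle; STRONG stays the default
        outputFormat.open(0, 1);
        return outputFormat;
    }
}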

(flink-connector-kudu) 11/44: [BAHIR-207] Add tests for scala 2.12 on travis (#59)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 47133a3e4873c0a4295ed4c002d805fbd3304408
Author: Joao Boto 
AuthorDate: Thu Jul 4 00:13:51 2019 +0200

[BAHIR-207] Add tests for scala 2.12 on travis (#59)
---
 .../apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java | 6 +++---
 .../org/apache/flink/streaming/connectors/kudu/KuduSinkTest.java| 6 +++---
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
index b9aaa40..4e91310 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
@@ -45,7 +45,7 @@ public class KuduOuputFormatTest extends KuduDatabase {
 public void testNotTableExist() throws IOException {
 String masterAddresses = harness.getMasterAddressesAsString();
 KuduTableInfo tableInfo = 
booksTableInfo(UUID.randomUUID().toString(),false);
-KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe());
+KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe());
 Assertions.assertThrows(UnsupportedOperationException.class, () -> 
outputFormat.open(0,1));
 }
 
@@ -54,7 +54,7 @@ public class KuduOuputFormatTest extends KuduDatabase {
 String masterAddresses = harness.getMasterAddressesAsString();
 
 KuduTableInfo tableInfo = 
booksTableInfo(UUID.randomUUID().toString(),true);
-KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe())
+KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe())
 .withStrongConsistency();
 outputFormat.open(0,1);
 
@@ -74,7 +74,7 @@ public class KuduOuputFormatTest extends KuduDatabase {
 String masterAddresses = harness.getMasterAddressesAsString();
 
 KuduTableInfo tableInfo = 
booksTableInfo(UUID.randomUUID().toString(),true);
-KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe())
+KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(masterAddresses, tableInfo, new DefaultSerDe())
 .withEventualConsistency();
 outputFormat.open(0,1);
 
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduSinkTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduSinkTest.java
index 83e060d..225bf7c 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduSinkTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduSinkTest.java
@@ -58,7 +58,7 @@ public class KuduSinkTest extends KuduDatabase {
 public void testNotTableExist() throws IOException {
 String masterAddresses = harness.getMasterAddressesAsString();
 KuduTableInfo tableInfo = 
booksTableInfo(UUID.randomUUID().toString(),false);
-KuduSink sink = new KuduSink<>(masterAddresses, tableInfo, new 
DefaultSerDe());
+KuduSink sink = new KuduSink<>(masterAddresses, tableInfo, 
new DefaultSerDe());
 sink.setRuntimeContext(context);
 Assertions.assertThrows(UnsupportedOperationException.class, () -> 
sink.open(new Configuration()));
 }
@@ -68,7 +68,7 @@ public class KuduSinkTest extends KuduDatabase {
 String masterAddresses = harness.getMasterAddressesAsString();
 
 KuduTableInfo tableInfo = 
booksTableInfo(UUID.randomUUID().toString(),true);
-KuduSink sink = new KuduSink<>(masterAddresses, tableInfo, new 
DefaultSerDe())
+KuduSink sink = new KuduSink<>(masterAddresses, tableInfo, 
new DefaultSerDe())
 .withStrongConsistency();
 sink.setRuntimeContext(context);
 sink.open(new Configuration());
@@ -88,7 +88,7 @@ public class KuduSinkTest extends KuduDatabase {
 String masterAddresses = harness.getMasterAddressesAsString();
 
 KuduTableInfo tableInfo = 
booksTableInfo(UUID.randomUUID().toString(),true);
-KuduSink sink = new KuduSink<>(masterAddresses, tableInfo, new 
DefaultSerDe())
+KuduSink sink = new KuduSink<>(masterAddresses, tableInfo, 
new DefaultSerDe())
 .withEventualConsistency();
 sink.setRuntimeContext(context);
 sink.open(new Configuration());



(flink-connector-kudu) 37/44: [FLINK-34930] Adapt POM files to the new structure, remove flink-shaded Guava usage

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 033554484423caa9ff1fbd830dc3b15b066b8463
Author: Ferenc Csaky 
AuthorDate: Thu Mar 28 12:30:27 2024 +0100

[FLINK-34930] Adapt POM files to the new structure, remove flink-shaded 
Guava usage
---
 flink-connector-kudu/pom.xml   | 237 ++-
 .../flink/connectors/kudu/table/KuduCatalog.java   |   3 +-
 .../connectors/kudu/table/KuduTableSource.java |   5 +-
 .../function/lookup/KuduRowDataLookupFunction.java |   4 +-
 .../kudu/table/utils/KuduTableUtils.java   |   3 +-
 pom.xml| 324 +
 6 files changed, 423 insertions(+), 153 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 8dc4f88..22817ff 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -1,148 +1,97 @@
 
 
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-  <modelVersion>4.0.0</modelVersion>
-
-  <parent>
-    <groupId>org.apache.bahir</groupId>
-    <artifactId>bahir-flink-parent</artifactId>
-    <version>1.2-SNAPSHOT</version>
-    <relativePath>../pom.xml</relativePath>
-  </parent>
-
-  <artifactId>flink-connector-kudu</artifactId>
-  <packaging>jar</packaging>
-
-  <name>flink-connector-kudu</name>
-
-  <properties>
-    <kudu.version>1.13.0</kudu.version>
-  </properties>
-
-  
-
-  
-org.apache.kudu
-kudu-client
-${kudu.version}
-  
-  
-org.apache.kudu
-kudu-test-utils
-${kudu.version}
-test
-  
-  
-org.apache.logging.log4j
-log4j-api
-${log4j.version}
-test
-  
-  
-org.apache.logging.log4j
-log4j-core
-${log4j.version}
-test
-  
-  
-org.apache.logging.log4j
-log4j-slf4j-impl
-${log4j.version}
-test
-  
-  
-  
-org.junit.jupiter
-junit-jupiter-migrationsupport
-${junit.jupiter.version}
-test
-  
-
-  
-
-  
-
-  org.apache.flink
-  flink-clients
-
-
-  org.apache.flink
-  flink-streaming-java
-
-
-  org.apache.flink
-  flink-table-api-java-bridge
-
-
-  org.apache.flink
-  flink-table-common
-
-
-  org.apache.flink
-  flink-table-planner-loader
-
-
-  org.apache.flink
-  flink-table-runtime
-
-
-  org.apache.kudu
-  kudu-client
-
-
-  org.apache.kudu
-  kudu-test-utils
-
-
-  org.apache.logging.log4j
-  log4j-api
-
-
-  org.apache.logging.log4j
-  log4j-core
-
-
-  org.apache.logging.log4j
-  log4j-slf4j-impl
-
-
-
-  org.junit.jupiter
-  junit-jupiter-migrationsupport
-
-
-  org.mockito
-  mockito-all
-
-
-  org.testcontainers
-  testcontainers
-
-  
-
-  
-
-  
-kr.motd.maven
-os-maven-plugin
-1.6.2
-  
-
-  
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
 
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+	<modelVersion>4.0.0</modelVersion>
+
+	<parent>
+		<groupId>org.apache.flink</groupId>
+		<artifactId>flink-connector-kudu-parent</artifactId>
+		<version>2.0-SNAPSHOT</version>
+	</parent>
+
+	<artifactId>flink-connector-kudu</artifactId>
+	<name>Flink : Connectors : Kudu</name>
+	<packaging>jar</packaging>
+
+	<dependencies>
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-clients</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-streaming-java</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-table-api-java-bridge</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-table-common</artifactId>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-table-planner-loader</artifactId>
+		</dependency>
+
+		<dependency>

(flink-connector-kudu) 39/44: [FLINK-34930] Rename base package

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 34cb67fd62aefc03ed67ee75731ef2a2c82b1fff
Author: Ferenc Csaky 
AuthorDate: Wed Apr 10 16:27:19 2024 +0200

[FLINK-34930] Rename base package
---
 .../kudu/connector/ColumnSchemasFactory.java   |  2 +-
 .../kudu/connector/CreateTableOptionsFactory.java  |  2 +-
 .../kudu/connector/KuduFilterInfo.java |  2 +-
 .../kudu/connector/KuduTableInfo.java  |  2 +-
 .../connector/converter/RowResultConverter.java|  2 +-
 .../connector/converter/RowResultRowConverter.java |  2 +-
 .../converter/RowResultRowDataConverter.java   |  2 +-
 .../failure/DefaultKuduFailureHandler.java |  2 +-
 .../kudu/connector/failure/KuduFailureHandler.java |  5 ++-
 .../kudu/connector/reader/KuduInputSplit.java  |  2 +-
 .../kudu/connector/reader/KuduReader.java  |  8 ++--
 .../kudu/connector/reader/KuduReaderConfig.java|  7 ++--
 .../kudu/connector/reader/KuduReaderIterator.java  |  4 +-
 .../writer/AbstractSingleOperationMapper.java  |  2 +-
 .../kudu/connector/writer/KuduOperationMapper.java |  2 +-
 .../kudu/connector/writer/KuduWriter.java  |  8 ++--
 .../kudu/connector/writer/KuduWriterConfig.java|  9 +++--
 .../kudu/connector/writer/PojoOperationMapper.java |  2 +-
 .../writer/RowDataUpsertOperationMapper.java   |  2 +-
 .../kudu/connector/writer/RowOperationMapper.java  |  2 +-
 .../connector/writer/TupleOperationMapper.java |  2 +-
 .../kudu/format/AbstractKuduInputFormat.java   | 18 -
 .../kudu/format/KuduOutputFormat.java  | 14 +++
 .../kudu/format/KuduRowDataInputFormat.java| 10 ++---
 .../kudu/format/KuduRowInputFormat.java| 10 ++---
 .../kudu/streaming/KuduSink.java   | 14 +++
 .../kudu/table/AbstractReadOnlyCatalog.java|  2 +-
 .../kudu/table/KuduCatalog.java| 25 ++---
 .../kudu/table/KuduTableFactory.java   | 10 ++---
 .../kudu/table/KuduTableSink.java  |  8 ++--
 .../kudu/table/KuduTableSource.java| 17 -
 .../kudu/table/UpsertOperationMapper.java  |  4 +-
 .../kudu/table/dynamic/KuduDynamicTableSink.java   | 10 ++---
 .../kudu/table/dynamic/KuduDynamicTableSource.java | 20 +-
 .../dynamic/KuduDynamicTableSourceSinkFactory.java | 12 +++---
 .../table/dynamic/catalog/KuduCatalogFactory.java  |  6 +--
 .../table/dynamic/catalog/KuduDynamicCatalog.java  | 18 -
 .../table/function/lookup/KuduLookupOptions.java   |  2 +-
 .../function/lookup/KuduRowDataLookupFunction.java | 20 +-
 .../kudu/table/utils/KuduTableUtils.java   | 43 +-
 .../kudu/table/utils/KuduTypeUtils.java|  2 +-
 .../org.apache.flink.table.factories.Factory   |  2 +-
 .../org.apache.flink.table.factories.TableFactory  |  4 +-
 .../kudu/connector/KuduFilterInfoTest.java |  2 +-
 .../kudu/connector/KuduTestBase.java   | 22 +--
 .../kudu/format/KuduOutputFormatTest.java  | 12 +++---
 .../kudu/format/KuduRowDataInputFormatTest.java| 16 
 .../kudu/format/KuduRowInputFormatTest.java| 12 +++---
 .../kudu/streaming/KuduSinkTest.java   | 12 +++---
 .../kudu/table/KuduCatalogTest.java| 14 +++
 .../kudu/table/KuduTableFactoryTest.java   |  8 ++--
 .../kudu/table/KuduTableSourceITCase.java  |  6 +--
 .../kudu/table/KuduTableSourceTest.java|  6 +--
 .../kudu/table/KuduTableTestUtils.java |  2 +-
 .../kudu/table/dynamic/KuduDynamicSinkTest.java|  6 +--
 .../kudu/table/dynamic/KuduDynamicSourceTest.java  |  6 +--
 .../dynamic/KuduRowDataLookupFunctionTest.java | 12 +++---
 .../kudu/writer/AbstractOperationTest.java |  4 +-
 .../kudu/writer/PojoOperationMapperTest.java   | 15 
 .../writer/RowDataUpsertOperationMapperTest.java   |  6 +--
 .../kudu/writer/RowOperationMapperTest.java|  8 ++--
 .../kudu/writer/TupleOperationMapperTest.java  |  8 ++--
 62 files changed, 264 insertions(+), 255 deletions(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connector/kudu/connector/ColumnSchemasFactory.java
similarity index 96%
rename from 
flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
rename to 
flink-connector-kudu/src/main/java/org/apache/flink/connector/kudu/connector/ColumnSchemasFactory.java
index 194af17..95fe97d 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/ColumnSchemasFactory.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connector/kudu/connector

(flink-connector-kudu) 28/44: [BAHIR-302] Group declaration of flink dependencies on parent pom

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 285a6e133745d462fb1773d203026594a4cce802
Author: Joao Boto 
AuthorDate: Mon May 2 20:00:15 2022 +0200

[BAHIR-302] Group declaration of flink dependencies on parent pom
---
 flink-connector-kudu/pom.xml | 24 
 1 file changed, 24 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 9ace382..6f68522 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -36,30 +36,6 @@
 
   
 
-    <dependency>
-      <groupId>org.apache.flink</groupId>
-      <artifactId>flink-clients_${scala.binary.version}</artifactId>
-      <version>${flink.version}</version>
-      <scope>test</scope>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.flink</groupId>
-      <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
-      <version>${flink.version}</version>
-      <scope>provided</scope>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.flink</groupId>
-      <artifactId>flink-table-common</artifactId>
-      <version>${flink.version}</version>
-      <scope>provided</scope>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.flink</groupId>
-      <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
-      <version>${flink.version}</version>
-      <scope>provided</scope>
-    </dependency>
     <dependency>
       <groupId>org.apache.kudu</groupId>
       <artifactId>kudu-binary</artifactId>



(flink-connector-kudu) 35/44: [BAHIR-308] Remove scala prefix where we can

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 78c76c544ab426461e9e4156ebb9ddf561b7b97c
Author: Joao Boto 
AuthorDate: Thu Mar 23 13:30:24 2023 +0100

[BAHIR-308] Remove scala prefix where we can
---
 flink-connector-kudu/pom.xml   | 14 +++---
 .../kudu/table/dynamic/KuduDynamicTableSource.java |  3 +--
 2 files changed, 12 insertions(+), 5 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 134d6f7..8dc4f88 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -20,12 +20,12 @@
 
   <parent>
     <groupId>org.apache.bahir</groupId>
-    <artifactId>bahir-flink-parent_2.12</artifactId>
+    <artifactId>bahir-flink-parent</artifactId>
     <version>1.2-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
-  <artifactId>flink-connector-kudu_2.12</artifactId>
+  <artifactId>flink-connector-kudu</artifactId>
   <packaging>jar</packaging>
 
   <name>flink-connector-kudu</name>
@@ -84,13 +84,21 @@
       <groupId>org.apache.flink</groupId>
       <artifactId>flink-streaming-java</artifactId>
     </dependency>
+    <dependency>
+      <groupId>org.apache.flink</groupId>
+      <artifactId>flink-table-api-java-bridge</artifactId>
+    </dependency>
     <dependency>
       <groupId>org.apache.flink</groupId>
       <artifactId>flink-table-common</artifactId>
     </dependency>
     <dependency>
       <groupId>org.apache.flink</groupId>
-      <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
+      <artifactId>flink-table-planner-loader</artifactId>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.flink</groupId>
+      <artifactId>flink-table-runtime</artifactId>
     </dependency>
     <dependency>
       <groupId>org.apache.kudu</groupId>
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/dynamic/KuduDynamicTableSource.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/dynamic/KuduDynamicTableSource.java
index cde6a13..da6bebb 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/dynamic/KuduDynamicTableSource.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/dynamic/KuduDynamicTableSource.java
@@ -43,7 +43,6 @@ import org.slf4j.LoggerFactory;
 
 import java.util.*;
 
-import static 
org.apache.flink.calcite.shaded.com.google.common.base.Preconditions.checkArgument;
 import static 
org.apache.flink.table.utils.TableSchemaUtils.containsPhysicalColumnsOnly;
 
 /**
@@ -143,7 +142,7 @@ public class KuduDynamicTableSource implements 
ScanTableSource, SupportsProjecti
 }
 
 private TableSchema projectSchema(TableSchema tableSchema, int[][] 
projectedFields) {
-checkArgument(
+Preconditions.checkArgument(
 containsPhysicalColumnsOnly(tableSchema),
 "Projection is only supported for physical columns.");
 TableSchema.Builder builder = TableSchema.builder();



(flink-connector-kudu) 20/44: [BAHIR-263] Update flink to 1.12.2 (#115)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 4a5ec773266d76568d89b5f291861307f64810fb
Author: Joao Boto 
AuthorDate: Fri Mar 12 08:53:06 2021 +0100

[BAHIR-263] Update flink to 1.12.2 (#115)
---
 .../connectors/kudu/table/KuduCatalogTest.java | 16 ++---
 .../kudu/table/KuduTableFactoryTest.java   | 12 +-
 .../connectors/kudu/table/KuduTableSourceTest.java |  2 +-
 .../src/test/resources/log4j.properties| 27 ++
 4 files changed, 42 insertions(+), 15 deletions(-)

diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduCatalogTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduCatalogTest.java
index 4bb1871..2bc8b12 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduCatalogTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduCatalogTest.java
@@ -77,7 +77,7 @@ public class KuduCatalogTest extends KuduTestBase {
 tableEnv.executeSql("INSERT INTO TestTable1 VALUES ('f', 's')")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 // Add this once Primary key support has been enabled
@@ -101,7 +101,7 @@ public class KuduCatalogTest extends KuduTestBase {
 tableEnv.executeSql("INSERT INTO TestTable3 VALUES ('f', 2, 't')")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 validateMultiKey("TestTable3");
@@ -113,14 +113,14 @@ public class KuduCatalogTest extends KuduTestBase {
 tableEnv.executeSql("INSERT INTO TestTable5 VALUES ('s', 'f', 't')")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 tableEnv.executeSql("CREATE TABLE TestTable6 (`first` STRING, `second` 
String) WITH ('kudu.hash-columns' = 'first', 'kudu.primary-key-columns' = 
'first')");
 tableEnv.executeSql("INSERT INTO TestTable6 (SELECT `first`, `second` 
FROM  TestTable5)")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 validateSingleKey("TestTable6");
@@ -133,12 +133,12 @@ public class KuduCatalogTest extends KuduTestBase {
 tableEnv.executeSql("INSERT INTO TestTableEP VALUES ('f','s')")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 tableEnv.executeSql("INSERT INTO TestTableEP VALUES ('f2','s2')")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 Table result = tableEnv.sqlQuery("SELECT COUNT(*) FROM TestTableEP");
@@ -225,7 +225,7 @@ public class KuduCatalogTest extends KuduTestBase {
 tableEnv.executeSql("INSERT INTO TestTableTsC values ('f', TIMESTAMP 
'2020-01-01 12:12:12.123456')")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 KuduTable kuduTable = harness.getClient().openTable("TestTableTsC");
@@ -252,7 +252,7 @@ public class KuduCatalogTest extends KuduTestBase {
 "TIMESTAMP '2020-04-15 12:34:56.123') ")
 .getJobClient()
 .get()
-.getJobExecutionResult(getClass().getClassLoader())
+.getJobExecutionResult()
 .get(1, TimeUnit.MINUTES);
 
 validateManyTypes("TestTable8");
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduTableFactoryTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduTableFactoryTest.java
index d852f8e..d4de7f6 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/table/KuduTableFactoryTest.java
+++ 
b/flink-connector-kudu/src/test/ja

(flink-connector-kudu) 05/44: [BAHIR-194] bump kudu version to 1.8.0

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit f0cd1d9243abb110ce312a82adb106afe233e078
Author: Joao Boto 
AuthorDate: Sun Feb 10 23:38:43 2019 +0100

[BAHIR-194] bump kudu version to 1.8.0

Closes 44
---
 flink-connector-kudu/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 61ab4a6..5540fc1 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -30,7 +30,7 @@
   jar
 
   
-1.7.1
+1.8.0
 
 !DockerTest
   



(flink-connector-kudu) 36/44: [BAHIR-324] Closing KuduReader at JobManager

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit d8c803af8921d4c26865143f5ea3976f43a163f2
Author: Shimin Huang <40719512+coll...@users.noreply.github.com>
AuthorDate: Wed May 31 00:37:14 2023 +0800

[BAHIR-324] Closing KuduReader at JobManager
---
 .../kudu/format/AbstractKuduInputFormat.java   | 22 +++---
 .../function/lookup/KuduRowDataLookupFunction.java |  2 +-
 2 files changed, 16 insertions(+), 8 deletions(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/format/AbstractKuduInputFormat.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/format/AbstractKuduInputFormat.java
index 0cdf570..4976241 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/format/AbstractKuduInputFormat.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/format/AbstractKuduInputFormat.java
@@ -104,19 +104,23 @@ public abstract class AbstractKuduInputFormat<T> extends RichInputFormat<T, KuduInputSplit> {
 ArrayList rows = new ArrayList<>();
 for (KuduInputSplit inputSplit : inputSplits) {
 KuduReaderIterator scanner = 
kuduReader.scanner(inputSplit.getScanToken());
-// 没有启用cache  (Chinese: "the cache is not enabled")
+// not use cache
 if (cache == null) {
 while (scanner.hasNext()) {
 collect(scanner.next());



(flink-connector-kudu) 19/44: [BAHIR-241] Upgrade all connectors to Flink 1.11 (#99)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 7b5d717b609ca68cd8e812773585c2c295946619
Author: Gyula Fora 
AuthorDate: Thu Sep 24 20:16:10 2020 +0200

[BAHIR-241] Upgrade all connectors to Flink 1.11 (#99)

Co-authored-by: Gyula Fora 
---
 flink-connector-kudu/pom.xml   | 32 ---
 .../kudu/connector/reader/KuduReader.java  |  2 +-
 .../kudu/connector/writer/KuduWriter.java  |  2 +-
 .../connectors/kudu/table/KuduTableFactory.java| 55 +++-
 .../flink/connectors/kudu/table/KuduTableSink.java |  3 -
 ...utFormatTest.java => KuduOutputFormatTest.java} |  4 +-
 .../connectors/kudu/streaming/KuduSinkTest.java|  2 +-
 .../connectors/kudu/table/KuduCatalogTest.java | 98 +-
 .../kudu/table/KuduTableFactoryTest.java   | 89 +++-
 .../kudu/table/KuduTableSourceITCase.java  | 16 ++--
 .../connectors/kudu/table/KuduTableSourceTest.java |  5 +-
 .../connectors/kudu/table/KuduTableTestUtils.java  |  2 +-
 .../{log4j.properties => log4j2-test.properties}   | 17 ++--
 13 files changed, 170 insertions(+), 157 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index bbf168e..a76102e 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -30,7 +30,7 @@
   jar
 
   
-1.11.1
+1.13.0
 1.10.19
   
 
@@ -79,6 +79,13 @@
   test
 
 
+
+org.apache.flink
+flink-clients_${scala.binary.version}
+${flink.version}
+test
+
+
 
 
   org.junit.jupiter
@@ -95,17 +102,22 @@
 
 
 
-  org.slf4j
-  slf4j-log4j12
-  ${slf4j.version}
-  runtime
+  org.apache.logging.log4j
+  log4j-api
+  ${log4j2.version}
+   test
 
-
 
-  log4j
-  log4j
-  ${log4j.version}
-  runtime
+  org.apache.logging.log4j
+  log4j-core
+  ${log4j2.version}
+   test
+
+
+  org.apache.logging.log4j
+  log4j-slf4j-impl
+  ${log4j2.version}
+  test
 
 
 
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/reader/KuduReader.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/reader/KuduReader.java
index 51ab748..d7a0c61 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/reader/KuduReader.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/reader/KuduReader.java
@@ -82,7 +82,7 @@ public class KuduReader implements AutoCloseable {
 if (tableInfo.getCreateTableIfNotExists()) {
 return client.createTable(tableName, tableInfo.getSchema(), 
tableInfo.getCreateTableOptions());
 }
-throw new UnsupportedOperationException("table not exists and is 
marketed to not be created");
+throw new RuntimeException("Table " + tableName + " does not exist.");
 }
 
 public KuduReaderIterator scanner(byte[] token) throws IOException {
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
index 7233478..03c37ea 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
@@ -83,7 +83,7 @@ public class KuduWriter implements AutoCloseable {
 if (tableInfo.getCreateTableIfNotExists()) {
 return client.createTable(tableName, tableInfo.getSchema(), 
tableInfo.getCreateTableOptions());
 }
-throw new UnsupportedOperationException("table not exists and is 
marketed to not be created");
+throw new RuntimeException("Table " + tableName + " does not exist.");
 }
 
 public void write(T input) throws IOException {
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
index eb72205..1961aad 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
@@ -30,7 +30,6 @@ import org.apache.flink.table.descriptors.SchemaValidator;
 import org.apache.flink.table.factories.TableSinkFactory;
 import org.apache.flink.table.factories.TableSourceFactory;
 import org.apache.flink.types.Row;
-import static org.apache.flink.util.Preconditions.checkNotNull;
 
 import java.uti

(flink-connector-kudu) 32/44: [BAHIR-321] Make KuduFilterInfo handle String literals

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 2e75a60111bd48d670e94b8b6fc1b2eb52fd70fe
Author: Balazs Varga 
AuthorDate: Tue Jan 3 18:07:55 2023 +0100

[BAHIR-321] Make KuduFilterInfo handle String literals
---
 .../connectors/kudu/connector/KuduFilterInfo.java  |  4 +-
 .../function/lookup/KuduRowDataLookupFunction.java |  8 +--
 .../kudu/connector/KuduFilterInfoTest.java | 41 
 .../connectors/kudu/connector/KuduTestBase.java|  2 +-
 .../kudu/table/dynamic/KuduDynamicSourceTest.java  | 57 ++
 5 files changed, 107 insertions(+), 5 deletions(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
index 08fa86b..e7a8d16 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
@@ -17,6 +17,7 @@
 package org.apache.flink.connectors.kudu.connector;
 
 import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.table.data.binary.BinaryStringData;
 
 import org.apache.kudu.ColumnSchema;
 import org.apache.kudu.Schema;
@@ -65,7 +66,8 @@ public class KuduFilterInfo implements Serializable {
 
 switch (column.getType()) {
 case STRING:
-predicate = KuduPredicate.newComparisonPredicate(column, 
comparison, (String) this.value);
+predicate = KuduPredicate.newComparisonPredicate(column, 
comparison,
+(this.value instanceof BinaryStringData) ? 
this.value.toString() : (String) this.value);
 break;
 case FLOAT:
 predicate = KuduPredicate.newComparisonPredicate(column, 
comparison, (float) this.value);
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/function/lookup/KuduRowDataLookupFunction.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/function/lookup/KuduRowDataLookupFunction.java
index 4a4a952..d54f23a 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/function/lookup/KuduRowDataLookupFunction.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/function/lookup/KuduRowDataLookupFunction.java
@@ -178,9 +178,11 @@ public class KuduRowDataLookupFunction extends TableFunction<RowData> {
 if (null != this.kuduReader) {
 try {
 this.kuduReader.close();
-this.cache.cleanUp();
-// help gc
-this.cache = null;
+if (cache != null) {
+this.cache.cleanUp();
+// help gc
+this.cache = null;
+}
 this.kuduReader = null;
 } catch (IOException e) {
 // ignore exception when close.
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfoTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfoTest.java
new file mode 100644
index 000..a6c2ff7
--- /dev/null
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfoTest.java
@@ -0,0 +1,41 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.flink.connectors.kudu.connector;
+
+import org.apache.flink.table.data.binary.BinaryStringData;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Type;
+import org.junit.jupiter.api.Test;
+
+import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;
+
+public class KuduFilterInfoTest {
+
+@Test
+void testKuduFilterInfoWithBinaryStringData() {
+String filterValue = "someValue";
+
+KuduFilterInfo kuduFilterInfo = KuduFilterInfo.Builder.create(&q
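(The test file is truncated by the archive.) For illustration, a minimal
sketch of the conversion this commit introduces; the variable names are
hypothetical, and only the instanceof check is taken from the diff above:

    // Flink's planner may hand the filter literal over as BinaryStringData
    // rather than java.lang.String, so convert before casting (as in the
    // KuduFilterInfo fix above).
    Object value = org.apache.flink.table.data.binary.BinaryStringData.fromString("someValue");
    String literal = (value instanceof BinaryStringData) ? value.toString() : (String) value;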

(flink-connector-kudu) 33/44: [BAHIR-308] Remove support for scala 2.11

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 5a1da3182cbf2c2afc0800a9da750c031a9915b2
Author: Joao Boto 
AuthorDate: Wed Jul 6 16:42:06 2022 +0200

[BAHIR-308] Remove support for scala 2.11
---
 flink-connector-kudu/pom.xml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index fd485d5..20b16b4 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -20,12 +20,12 @@
 
   <parent>
     <groupId>org.apache.bahir</groupId>
-    <artifactId>bahir-flink-parent_2.11</artifactId>
+    <artifactId>bahir-flink-parent_2.12</artifactId>
    <version>1.2-SNAPSHOT</version>
    <relativePath>../pom.xml</relativePath>
   </parent>
 
-  <artifactId>flink-connector-kudu_2.11</artifactId>
+  <artifactId>flink-connector-kudu_2.12</artifactId>
   <packaging>jar</packaging>
 
   <name>flink-connector-kudu</name>



(flink-connector-kudu) 26/44: [BAHIR-302] Add enforcers

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit e9c2a761dc5461e7fe229d837daf412a3e25fd99
Author: Joao Boto 
AuthorDate: Mon Mar 28 17:18:05 2022 +0200

[BAHIR-302] Add enforcers
---
 flink-connector-kudu/pom.xml | 143 ---
 1 file changed, 92 insertions(+), 51 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 750929c..2a2dde2 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -22,99 +22,140 @@
 org.apache.bahir
 bahir-flink-parent_2.11
 1.1-SNAPSHOT
-..
+../pom.xml
   
 
   flink-connector-kudu_2.11
-  flink-connector-kudu
   jar
 
+  flink-connector-kudu
+
   
 1.13.0
   
 
-  
-
-
-  org.apache.kudu
-  kudu-client
-  ${kudu.version}
-
+  
+
+  
+org.apache.flink
+flink-clients_${scala.binary.version}
+${flink.version}
+test
+  
+  
+org.apache.flink
+flink-streaming-java_${scala.binary.version}
+${flink.version}
+provided
+  
+  
+org.apache.flink
+flink-table-common
+${flink.version}
+provided
+  
+  
+org.apache.flink
+flink-table-planner_${scala.binary.version}
+${flink.version}
+provided
+  
+  
+org.apache.kudu
+kudu-binary
+${kudu.version}
+${os.detected.classifier}
+test
+  
+  
+org.apache.kudu
+kudu-client
+${kudu.version}
+  
+  
+org.apache.kudu
+kudu-test-utils
+${kudu.version}
+test
+  
+  
+org.apache.logging.log4j
+log4j-api
+${log4j.version}
+test
+  
+  
+org.apache.logging.log4j
+log4j-core
+${log4j.version}
+test
+  
+  
+org.apache.logging.log4j
+log4j-slf4j-impl
+${log4j.version}
+test
+  
+  
+  
+org.junit.jupiter
+junit-jupiter-migrationsupport
+${junit.jupiter.version}
+test
+  
+
+  
 
+  
 
   org.apache.flink
-  flink-streaming-java_${scala.binary.version}
-  ${flink.version}
-  provided
+  flink-clients_${scala.binary.version}
 
-
 
   org.apache.flink
-  flink-table-planner_${scala.binary.version}
-  ${flink.version}
-  provided
+  flink-streaming-java_${scala.binary.version}
 
-
 
   org.apache.flink
   flink-table-common
-  ${flink.version}
-  provided
 
-
-
 
-  org.apache.kudu
-  kudu-test-utils
-  ${kudu.version}
-  test
+  org.apache.flink
+  flink-table-planner_${scala.binary.version}
 
-
 
   org.apache.kudu
   kudu-binary
-  ${kudu.version}
   ${os.detected.classifier}
-  test
-
-
-
-org.apache.flink
-flink-clients_${scala.binary.version}
-${flink.version}
-test
 
-
-
 
-  org.junit.jupiter
-  junit-jupiter-migrationsupport
-  ${junit.jupiter.version}
-  test
+  org.apache.kudu
+  kudu-client
 
-
 
-  org.mockito
-  mockito-all
+  org.apache.kudu
+  kudu-test-utils
 
-
 
   org.apache.logging.log4j
   log4j-api
-  ${log4j.version}
-   test
 
 
   org.apache.logging.log4j
   log4j-core
-  ${log4j.version}
-   test
 
 
   org.apache.logging.log4j
   log4j-slf4j-impl
-  ${log4j.version}
-  test
+
+
+
+  org.junit.jupiter
+  junit-jupiter-migrationsupport
+
+
+  org.mockito
+  mockito-all
 
   
 



(flink-connector-kudu) 21/44: [BAHIR-260] Add kudu table writer config (#109)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 65004752884890ee49dbffd5bfb52a2a863758ed
Author: hackergin 
AuthorDate: Wed Apr 7 06:35:07 2021 -0500

[BAHIR-260] Add kudu table writer config  (#109)
---
 .../connectors/kudu/connector/KuduTableInfo.java   |  18 
 .../kudu/connector/writer/KuduWriter.java  |   5 +
 .../kudu/connector/writer/KuduWriterConfig.java| 113 -
 .../connectors/kudu/table/KuduTableFactory.java|  49 -
 .../flink/connectors/kudu/table/KuduTableSink.java |  21 
 .../kudu/table/KuduTableFactoryTest.java   |  44 +++-
 6 files changed, 243 insertions(+), 7 deletions(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduTableInfo.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduTableInfo.java
index 83c7dde..baae8a0 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduTableInfo.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduTableInfo.java
@@ -23,6 +23,7 @@ import org.apache.kudu.Schema;
 import org.apache.kudu.client.CreateTableOptions;
 
 import java.io.Serializable;
+import java.util.Objects;
 
 /**
  * Describes which table should be used in sources and sinks along with 
specifications
@@ -103,4 +104,21 @@ public class KuduTableInfo implements Serializable {
 }
 return createTableOptionsFactory.getCreateTableOptions();
 }
+
+@Override
+public int hashCode() {
+return Objects.hash(name);
+}
+
+@Override
+public boolean equals(Object o) {
+if (this == o) {
+return true;
+}
+if (o == null || getClass() != o.getClass()) {
+return false;
+}
+KuduTableInfo that = (KuduTableInfo) o;
+return Objects.equals(this.name, that.name);
+}
 }
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
index 03c37ea..59ad196 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
@@ -72,6 +72,11 @@ public class KuduWriter implements AutoCloseable {
 private KuduSession obtainSession() {
 KuduSession session = client.newSession();
 session.setFlushMode(writerConfig.getFlushMode());
+session.setTimeoutMillis(writerConfig.getOperationTimeout());
+session.setMutationBufferSpace(writerConfig.getMaxBufferSize());
+session.setFlushInterval(writerConfig.getFlushInterval());
+session.setIgnoreAllDuplicateRows(writerConfig.isIgnoreDuplicate());
+session.setIgnoreAllNotFoundRows(writerConfig.isIgnoreNotFound());
 return session;
 }
 
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriterConfig.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriterConfig.java
index 598f8d0..ff93921 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriterConfig.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriterConfig.java
@@ -19,8 +19,10 @@ package org.apache.flink.connectors.kudu.connector.writer;
 import org.apache.flink.annotation.PublicEvolving;
 
 import org.apache.commons.lang3.builder.ToStringBuilder;
+import org.apache.kudu.client.AsyncKuduClient;
 
 import java.io.Serializable;
+import java.util.Objects;
 
 import static org.apache.flink.util.Preconditions.checkNotNull;
 import static org.apache.kudu.client.SessionConfiguration.FlushMode;
@@ -34,13 +36,28 @@ public class KuduWriterConfig implements Serializable {
 
 private final String masters;
 private final FlushMode flushMode;
+private final long operationTimeout;
+private int maxBufferSize;
+private int flushInterval;
+private boolean ignoreNotFound;
+private boolean ignoreDuplicate;
 
 private KuduWriterConfig(
 String masters,
-FlushMode flushMode) {
+FlushMode flushMode,
+long operationTimeout,
+int maxBufferSize,
+int flushInterval,
+boolean ignoreNotFound,
+boolean ignoreDuplicate) {
 
 this.masters = checkNotNull(masters, "Kudu masters cannot be null");
 this.flushMode = checkNotNull(flushMode, "Kudu flush mode cannot be 
null");
+this.operationTimeout = operationTimeout;
+  

(flink-connector-kudu) 12/44: [BAHIR-214] Improve speed and solve eventual consistence issues (#64)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 73e87f9c2e9d3c5ef59bbe723a930b2cf5cffc1d
Author: Joao Boto 
AuthorDate: Tue Sep 3 02:05:56 2019 +0200

[BAHIR-214] Improve speed and solve eventual consistence issues (#64)

* resolve eventual consistency issues
* improve speed, especially on eventual-consistency streams
* update the README
---
 flink-connector-kudu/README.md |  60 +++---
 .../connectors/kudu/batch/KuduInputFormat.java | 137 +
 .../connectors/kudu/batch/KuduOutputFormat.java|  93 +
 .../connectors/kudu/connector/KuduColumnInfo.java  |   4 +-
 .../connectors/kudu/connector/KuduFilterInfo.java  |   5 +-
 .../connectors/kudu/connector/KuduRow.java |   4 +-
 .../connectors/kudu/connector/KuduTableInfo.java   |   4 +-
 .../failure/DefaultKuduFailureHandler.java |  33 
 .../kudu/connector/failure/KuduFailureHandler.java |  37 
 .../kudu/connector/reader/KuduInputSplit.java  |  39 
 .../kudu/connector/reader/KuduReader.java  | 170 +
 .../kudu/connector/reader/KuduReaderConfig.java|  82 
 .../kudu/connector/reader/KuduReaderIterator.java  | 112 +++
 .../kudu/connector}/serde/DefaultSerDe.java|   4 +-
 .../kudu/connector}/serde/KuduDeserialization.java |   4 +-
 .../kudu/connector}/serde/KuduSerialization.java   |   4 +-
 .../kudu/connector}/serde/PojoSerDe.java   |   6 +-
 .../kudu/connector/writer/KuduWriter.java  | 209 
 .../kudu/connector/writer/KuduWriterConfig.java| 113 +++
 .../connector/writer/KuduWriterConsistency.java|  32 
 .../kudu/connector/writer/KuduWriterMode.java} |  18 +-
 .../flink/connectors/kudu/streaming/KuduSink.java  |  89 +
 .../streaming/connectors/kudu/KuduInputFormat.java | 211 -
 .../connectors/kudu/KuduOutputFormat.java  | 121 
 .../flink/streaming/connectors/kudu/KuduSink.java  | 157 ---
 .../connectors/kudu/connector/KuduConnector.java   | 170 -
 .../connectors/kudu/connector/KuduMapper.java  | 143 --
 .../connectors/kudu/connector/KuduRowIterator.java |  57 --
 .../kudu/batch}/KuduInputFormatTest.java   |  41 ++--
 .../kudu/batch}/KuduOuputFormatTest.java   |  50 +++--
 .../connectors/kudu/connector/KuduDatabase.java|  48 -
 .../kudu/connector}/serde/PojoSerDeTest.java   |   8 +-
 .../connectors/kudu/streaming/KuduSinkTest.java| 159 
 .../streaming/connectors/kudu/KuduSinkTest.java| 109 ---
 34 files changed, 1467 insertions(+), 1066 deletions(-)

diff --git a/flink-connector-kudu/README.md b/flink-connector-kudu/README.md
index 9b75aa7..8692ca5 100644
--- a/flink-connector-kudu/README.md
+++ b/flink-connector-kudu/README.md
@@ -29,15 +29,19 @@ env.setParallelism(PARALLELISM);
 // create a table info object
 KuduTableInfo tableInfo = KuduTableInfo.Builder
 .create("books")
-.addColumn(KuduColumnInfo.Builder.create("id", 
Type.INT32).key(true).hashKey(true).build())
-.addColumn(KuduColumnInfo.Builder.create("title", Type.STRING).build())
-.addColumn(KuduColumnInfo.Builder.create("author", 
Type.STRING).build())
-.addColumn(KuduColumnInfo.Builder.create("price", Type.DOUBLE).build())
-.addColumn(KuduColumnInfo.Builder.create("quantity", 
Type.INT32).build())
+
.addColumn(KuduColumnInfo.Builder.createInteger("id").asKey().asHashKey().build())
+.addColumn(KuduColumnInfo.Builder.createString("title").build())
+.addColumn(KuduColumnInfo.Builder.createString("author").build())
+.addColumn(KuduColumnInfo.Builder.createDouble("price").build())
+.addColumn(KuduColumnInfo.Builder.createInteger("quantity").build())
 .build();
-
+// create a reader configuration
+KuduReaderConfig readerConfig = KuduReaderConfig.Builder
+.setMasters("172.25.0.6")
+.setRowLimit(1000)
+.build();
 // Pass the tableInfo to the KuduInputFormat and provide kuduMaster ips
-env.createInput(new KuduInputFormat<>("172.25.0.6", tableInfo))
+env.createInput(new KuduInputFormat<>(readerConfig, tableInfo, new 
DefaultSerDe()))
 .count();
 
 env.execute();
@@ -54,18 +58,23 @@ env.setParallelism(PARALLELISM);
 KuduTableInfo tableInfo = KuduTableInfo.Builder
 .create("books")
 .createIfNotExist(true)
-.replicas(1)
-.addColumn(KuduColumnInfo.Builder.create("id", 
Type.INT32).key(true).hashKey(true).build())
-.addColumn(KuduColumnInfo.Builder.create("title", Type.STRIN

(flink-connector-kudu) 31/44: [BAHIR-312] Add license header to README.md files

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit bef51ab2af1db28f82c4bea5864fd23202550bed
Author: Joao Boto 
AuthorDate: Sat Jul 30 17:45:39 2022 +0200

[BAHIR-312] Add license header to README.md files
---
 flink-connector-kudu/README.md | 16 
 1 file changed, 16 insertions(+)

diff --git a/flink-connector-kudu/README.md b/flink-connector-kudu/README.md
index c52adb1..7908e08 100644
--- a/flink-connector-kudu/README.md
+++ b/flink-connector-kudu/README.md
@@ -1,3 +1,19 @@
+ 
+
 # Flink Kudu Connector
 
 This connector provides a source (```KuduInputFormat```), a sink/output



(flink-connector-kudu) 27/44: [BAHIR-243] Change KuduTestHarness with TestContainers

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 6832f05fd4b73ef3583e24e9968ccbd0faa4
Author: Joao Boto 
AuthorDate: Thu Mar 11 19:03:43 2021 +0100

[BAHIR-243] Change KuduTestHarness with TestContainers
---
 flink-connector-kudu/pom.xml   |  4 +
 .../connectors/kudu/batch/KuduInputFormatTest.java |  4 +-
 .../kudu/batch/KuduOutputFormatTest.java   |  8 +-
 .../connectors/kudu/connector/KuduTestBase.java| 85 +-
 .../connectors/kudu/streaming/KuduSinkTest.java| 11 ++-
 .../connectors/kudu/table/KuduCatalogTest.java | 20 ++---
 .../kudu/table/KuduTableFactoryTest.java   | 10 +--
 .../kudu/table/KuduTableSourceITCase.java  |  2 +-
 .../connectors/kudu/table/KuduTableSourceTest.java |  2 +-
 9 files changed, 99 insertions(+), 47 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 2a2dde2..9ace382 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -157,6 +157,10 @@
       <groupId>org.mockito</groupId>
       <artifactId>mockito-all</artifactId>
     </dependency>
+    <dependency>
+      <groupId>org.testcontainers</groupId>
+      <artifactId>testcontainers</artifactId>
+    </dependency>
   </dependencies>
 
   
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduInputFormatTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduInputFormatTest.java
index fb7d8e3..126f7fd 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduInputFormatTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduInputFormatTest.java
@@ -39,7 +39,7 @@ class KuduInputFormatTest extends KuduTestBase {
 
 @Test
 void testInvalidTableInfo() {
-String masterAddresses = harness.getMasterAddressesAsString();
+String masterAddresses = getMasterAddress();
 KuduReaderConfig readerConfig = 
KuduReaderConfig.Builder.setMasters(masterAddresses).build();
 Assertions.assertThrows(NullPointerException.class, () -> new 
KuduRowInputFormat(readerConfig, null));
 }
@@ -71,7 +71,7 @@ class KuduInputFormatTest extends KuduTestBase {
 }
 
 private List readRows(KuduTableInfo tableInfo, String... 
fieldProjection) throws Exception {
-String masterAddresses = harness.getMasterAddressesAsString();
+String masterAddresses = getMasterAddress();
 KuduReaderConfig readerConfig = 
KuduReaderConfig.Builder.setMasters(masterAddresses).build();
 KuduRowInputFormat inputFormat = new KuduRowInputFormat(readerConfig, 
tableInfo, new ArrayList<>(),
 fieldProjection == null ? null : 
Arrays.asList(fieldProjection));
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOutputFormatTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOutputFormatTest.java
index 22fa0a4..693e113 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOutputFormatTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOutputFormatTest.java
@@ -39,14 +39,14 @@ class KuduOutputFormatTest extends KuduTestBase {
 
 @Test
 void testInvalidTableInfo() {
-String masterAddresses = harness.getMasterAddressesAsString();
+String masterAddresses = getMasterAddress();
 KuduWriterConfig writerConfig = 
KuduWriterConfig.Builder.setMasters(masterAddresses).build();
 Assertions.assertThrows(NullPointerException.class, () -> new 
KuduOutputFormat<>(writerConfig, null, null));
 }
 
 @Test
 void testNotTableExist() {
-String masterAddresses = harness.getMasterAddressesAsString();
+String masterAddresses = getMasterAddress();
 KuduTableInfo tableInfo = booksTableInfo(UUID.randomUUID().toString(), 
false);
 KuduWriterConfig writerConfig = 
KuduWriterConfig.Builder.setMasters(masterAddresses).build();
 KuduOutputFormat outputFormat = new 
KuduOutputFormat<>(writerConfig, tableInfo, new 
RowOperationMapper(KuduTestBase.columns, 
AbstractSingleOperationMapper.KuduOperation.INSERT));
@@ -55,7 +55,7 @@ class KuduOutputFormatTest extends KuduTestBase {
 
 @Test
 void testOutputWithStrongConsistency() throws Exception {
-String masterAddresses = harness.getMasterAddressesAsString();
+String masterAddresses = getMasterAddress();
 
 KuduTableInfo tableInfo = booksTableInfo(UUID.randomUUID().toString(), 
true);
 KuduWriterConfig writerConfig = KuduWriterConfig.Builder
@@ -80,7 +80,7 @@ class KuduOutputFormatTest extends KuduTestBase {
 
 @Test
 void testOutputWithEventualConsistency() throws Exception {
-String masterAddresses = harness.getMasterAddressesAsString();
+String masterAddr
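(The diff is truncated by the archive.) For context, a minimal Testcontainers
sketch of the pattern this commit moves to; the Kudu image name and port are
assumptions, not taken from this diff:

    import org.testcontainers.containers.GenericContainer;
    import org.testcontainers.utility.DockerImageName;

    GenericContainer<?> kuduMaster =
            new GenericContainer<>(DockerImageName.parse("apache/kudu:1.13.0"))
                    .withCommand("master")       // run the container as a Kudu master
                    .withExposedPorts(7051);     // default Kudu master RPC port
    kuduMaster.start();
    // roughly what a getMasterAddress() helper would then return:
    String masterAddress = kuduMaster.getHost() + ":" + kuduMaster.getMappedPort(7051);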

(flink-connector-kudu) 01/44: [FLINK-34930] Initialize tools/ and vcs.xml

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 427098db43f40fde797efbeb16723eb4e7cef233
Author: Ferenc Csaky 
AuthorDate: Tue Mar 26 15:21:33 2024 +0100

[FLINK-34930] Initialize tools/ and vcs.xml
---
 .gitmodules  |   4 +
 .idea/vcs.xml|  25 ++
 tools/ci/log4j.properties|  43 
 tools/maven/checkstyle.xml   | 562 +++
 tools/maven/suppressions.xml |  26 ++
 5 files changed, 660 insertions(+)

diff --git a/.gitmodules b/.gitmodules
new file mode 100644
index 000..e5d40f3
--- /dev/null
+++ b/.gitmodules
@@ -0,0 +1,4 @@
+[submodule "tools/releasing/shared"]
+   path = tools/releasing/shared
+   url = https://github.com/apache/flink-connector-shared-utils
+   branch = release_utils
diff --git a/.idea/vcs.xml b/.idea/vcs.xml
new file mode 100644
index 000..31d1734
--- /dev/null
+++ b/.idea/vcs.xml
@@ -0,0 +1,25 @@
+
+
+  
+
+  
+
+  
+  https://issues.apache.org/jira/browse/$0; />
+
+
+  
+  https://cwiki.apache.org/confluence/display/FLINK/$0; />
+
+
+  
+  https://github.com/apache/flink-connector-kudu/pull/$1; />
+
+  
+
+  
+  
+
+
+  
+
diff --git a/tools/ci/log4j.properties b/tools/ci/log4j.properties
new file mode 100644
index 000..7daf1c3
--- /dev/null
+++ b/tools/ci/log4j.properties
@@ -0,0 +1,43 @@
+
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+rootLogger.level = INFO
+rootLogger.appenderRef.out.ref = ConsoleAppender
+
+# -
+# Console (use 'console')
+# -
+
+appender.console.name = ConsoleAppender
+appender.console.type = CONSOLE
+appender.console.layout.type = PatternLayout
+appender.console.layout.pattern = %d{HH:mm:ss,SSS} [%20t] %-5p %-60c %x - %m%n
+
+# -
+# File (use 'file')
+# -
+appender.file.name = FileAppender
+appender.file.type = FILE
+appender.file.fileName = ${sys:log.dir}/mvn-${sys:mvn.forkNumber:-output}.log
+appender.file.layout.type = PatternLayout
+appender.file.layout.pattern = %d{HH:mm:ss,SSS} [%20t] %-5p %-60c %x - %m%n
+appender.file.createOnDemand = true
+
+# suppress the irrelevant (wrong) warnings from the netty channel handler
+logger.netty.name = org.jboss.netty.channel.DefaultChannelPipeline
+logger.netty.level = ERROR
diff --git a/tools/maven/checkstyle.xml b/tools/maven/checkstyle.xml
new file mode 100644
index 000..2048fd1
--- /dev/null
+++ b/tools/maven/checkstyle.xml
@@ -0,0 +1,562 @@
+
+
+http://www.pu

(flink-connector-kudu) 10/44: [BAHIR-209] upgrade kudu version to 1.10.0 (#61)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 02deb5127a163c32e75d2c3ed23614c0934f109e
Author: Joao Boto 
AuthorDate: Thu Jul 4 00:11:57 2019 +0200

[BAHIR-209] upgrade kudu version to 1.10.0 (#61)
---
 flink-connector-kudu/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index a504b89..bd42c55 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -30,7 +30,7 @@
   jar
 
   
-1.9.0
+1.10.0
 
 1.10.19
 !DockerTest



(flink-connector-kudu) branch main updated (d837674 -> 8674446)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git


from d837674  Initial commit
 new 427098d  [FLINK-34930] Initialize tools/ and vcs.xml
 new 663ffd1  [FLINK-34930] Remove Python CI
 new 37a3e55  [BAHIR-99] kudu connector
 new 701fabe  [BAHIR-179] Fail Docker integration tests silently
 new f0cd1d9  [BAHIR-194] bump kudu version to 1.8.0
 new 247dfe9  [BAHIR-180] Improve eventual consistence for Kudu connector
 new 1e26505  [BAHIR-199] Bump kudu version to 1.9.0
 new f9c75bb  [BAHIR-202] Improve KuduSink throughput using async FlushMode
 new a1b4da5  [BAHIR-200] Move tests from docker to kudu-test-utils (#49)
 new 02deb51  [BAHIR-209] upgrade kudu version to 1.10.0 (#61)
 new 47133a3  [BAHIR-207] Add tests for scala 2.12 on travis (#59)
 new 73e87f9  [BAHIR-214] Improve speed and solve eventual consistence 
issues (#64)
 new c380e40  Add support "upsert part of columns of a kudu table" (#70)
 new 9b99c18  Fix "the client has already been closed" bug (#75)
 new 6da6ad7  Fix NotSerializableException in Kudu connector (#74)
 new 3c57f2e  Kudu Connector rework (#78)
 new 6e8b66f  Add batch table env support and filter push down to Kudu 
connector (#82)
 new 3f9db63  BAHIR-240: replace docker test by testcontainer (#89)
 new 7b5d717  [BAHIR-241] Upgrade all connectors to Flink 1.11 (#99)
 new 4a5ec77  [BAHIR-263] Update flink to 1.12.2 (#115)
 new 6500475  [BAHIR-260] Add kudu table writer config  (#109)
 new 03f0312  [BAHIR-293] Fix documentation tables
 new b9cd23e  [BAHIR-291] Bump flink to 1.14.0 (#136)
 new 7700edb  [BAHIR-296] Unify log4j libs
 new 1dce829  BAHIR-296: Unify mockito version to 1.10.19
 new e9c2a76  [BAHIR-302] Add enforcers
 new 6832f05  [BAHIR-243] Change KuduTestHarness with TestContainers
 new 285a6e1  [BAHIR-302] Group declaration of flink dependencies on parent 
pom
 new af8dff4  [maven-release-plugin] prepare for next development iteration
 new aca6e7b  [BAHIR-305] Kudu Flink SQL Support 
DynamicSource/Sink
 new bef51ab  [BAHIR-312] Add license header to README.md files
 new 2e75a60  [BAHIR-321] Make KuduFilterInfo handle String literals
 new 5a1da31  [BAHIR-308] Remove support for scala 2.11
 new 6ff53e7  [BAHIR-308] Bump flink version to 1.15.3
 new 78c76c5  [BAHIR-308] Remove scala prefix where we can
 new d8c803a  [BAHIR-324] Closing KuduReader at JobManager
 new 0335544  [FLINK-34930] Adapt POM files to the new structure, remove 
flink-shaded Guava usage
 new abf6eba  [FLINK-34930] Apply spotless code format and checkstyle rules
 new 34cb67f  [FLINK-34930] Rename base package
 new abc4a3b  [FLINK-34930] Fix KuduTableSourceITCase
 new 9253b9d  [FLINK-34930] State Bahir fork
 new c7dfb67  [FLINK-34930] Skip spotless for JDK 21+
 new 5230ec1  [FLINK-34930] Enable module opens for tests for newer JDKs
 new 8674446  [FLINK-34930] Migrate Bahir NOTICE

The 44 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/push_pr.yml  |   7 -
 .gitmodules|   4 +
 .idea/vcs.xml  |  25 +
 NOTICE |   6 +-
 README.md  |   6 +-
 flink-connector-kudu/README.md | 324 
 flink-connector-kudu/pom.xml   |  97 
 .../kudu/connector/ColumnSchemasFactory.java   |  43 ++
 .../kudu/connector/CreateTableOptionsFactory.java  |  42 ++
 .../connector/kudu/connector/KuduFilterInfo.java   | 199 
 .../connector/kudu/connector/KuduTableInfo.java| 127 +
 .../connector/converter/RowResultConverter.java|  41 ++
 .../connector/converter/RowResultRowConverter.java |  46 ++
 .../converter/RowResultRowDataConverter.java   | 104 
 .../failure/DefaultKuduFailureHandler.java |  38 ++
 .../kudu/connector/failure/KuduFailureHandler.java |  54 ++
 .../kudu/connector/reader/KuduInputSplit.java  |  46 ++
 .../kudu/connector/reader/KuduReader.java  | 198 
 .../kudu/connector/reader/KuduReaderConfig.java|  89 
 .../kudu/connector/reader/KuduReaderIterator.java  |  68 +++
 .../writer/AbstractSingleOperationMapper.java  | 108 
 .../kudu/connector/writer/KuduOperationMapper.java |  46 ++
 .../kudu/connector/writer/KuduWriter.java  | 168 ++
 .../kudu/connector/writer/KuduWriterConfig.java| 206 
 .../kudu/connector/writer/PojoOpe

(flink-connector-kudu) 29/44: [maven-release-plugin] prepare for next development iteration

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit af8dff48d082bf8f340e181a8c5719993de47eef
Author: Joao Boto 
AuthorDate: Thu Jun 16 15:44:53 2022 +0200

[maven-release-plugin] prepare for next development iteration
---
 flink-connector-kudu/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 6f68522..fd485d5 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.bahir</groupId>
     <artifactId>bahir-flink-parent_2.11</artifactId>
-    <version>1.1-SNAPSHOT</version>
+    <version>1.2-SNAPSHOT</version>
    <relativePath>../pom.xml</relativePath>
   </parent>
 



(flink-connector-kudu) 09/44: [BAHIR-200] Move tests from docker to kudu-test-utils (#49)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit a1b4da53f4a73eaa22e5dbb4d22e0ed897ec4f0a
Author: Joao Boto 
AuthorDate: Sun May 26 02:45:25 2019 +0200

[BAHIR-200] Move tests from docker to kudu-test-utils (#49)
---
 flink-connector-kudu/dockers/docker-compose.yml| 92 --
 flink-connector-kudu/dockers/role/Dockerfile   | 41 --
 .../dockers/role/docker-entrypoint.sh  | 69 
 flink-connector-kudu/dockers/run_kudu_tests.sh | 68 
 flink-connector-kudu/dockers/start-images.sh   | 42 --
 flink-connector-kudu/dockers/stop-images.sh| 33 
 flink-connector-kudu/pom.xml   | 42 +++---
 .../connectors/kudu/connector/KuduFilterInfo.java  |  4 +-
 .../streaming/connectors/kudu/DockerTest.java  | 31 
 .../connectors/kudu/KuduInputFormatTest.java   |  8 +-
 .../connectors/kudu/KuduOuputFormatTest.java   | 13 +--
 .../streaming/connectors/kudu/KuduSinkTest.java| 29 +--
 .../connectors/kudu/connector/KuduDatabase.java| 16 +++-
 13 files changed, 82 insertions(+), 406 deletions(-)

diff --git a/flink-connector-kudu/dockers/docker-compose.yml 
b/flink-connector-kudu/dockers/docker-compose.yml
deleted file mode 100644
index d2c95bb..000
--- a/flink-connector-kudu/dockers/docker-compose.yml
+++ /dev/null
@@ -1,92 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-version: '2'
-
-services:
-
-  kudu-master:
-image: eskabetxe/kudu
-container_name: kudu-master
-hostname: 172.25.0.6
-ports:
-  - "8051:8051"
-volumes:
-  - /var/lib/kudu/master
-command: master
-networks:
-  mynet:
-ipv4_address: 172.25.0.6
-
-  kudu-server1:
-image: eskabetxe/kudu
-container_name: kudu-server1
-hostname: 172.25.0.7
-environment:
-  - KUDU_MASTER=172.25.0.6
-ports:
-  - "8054:8050"
-volumes:
-  - /var/lib/kudu/server
-command: tserver
-networks:
-  mynet:
-ipv4_address: 172.25.0.7
-links:
-  - kudu-master
-
-  kudu-server2:
-image: eskabetxe/kudu
-container_name: kudu-server2
-hostname: 172.25.0.8
-environment:
-  - KUDU_MASTER=172.25.0.6
-ports:
-  - "8052:8050"
-volumes:
-  - /var/lib/kudu/server
-command: tserver
-networks:
-  mynet:
-ipv4_address: 172.25.0.8
-links:
-  - kudu-master
-
-  kudu-server3:
-image: eskabetxe/kudu
-container_name: kudu-server3
-hostname: 172.25.0.9
-environment:
-  - KUDU_MASTER=172.25.0.6
-ports:
-  - "8053:8050"
-volumes:
-  - /var/lib/kudu/server
-command: tserver
-networks:
-  mynet:
-ipv4_address: 172.25.0.9
-links:
-  - kudu-master
-
-networks:
-  mynet:
-driver: bridge
-ipam:
-  config:
-- subnet: 172.25.0.0/24
-  IPRange: 172.25.0.2/24,
-  gateway: 172.25.0.1
diff --git a/flink-connector-kudu/dockers/role/Dockerfile 
b/flink-connector-kudu/dockers/role/Dockerfile
deleted file mode 100644
index b14b087..000
--- a/flink-connector-kudu/dockers/role/Dockerfile
+++ /dev/null
@@ -1,41 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-FROM bitnami/minideb:j

(flink-connector-kudu) 08/44: [BAHIR-202] Improve KuduSink throughput using async FlushMode

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit f9c75bbc99f190fd562215ee1dfef53a36241cda
Author: SuXingLee <913910...@qq.com>
AuthorDate: Fri Mar 22 20:08:33 2019 +0800

[BAHIR-202] Improve KuduSink throughput using async FlushMode

By default, KuduSink processes messages one by one
without checkpointing. When checkpointing is enabled, throughput
is improved by using FlushMode.AUTO_FLUSH_BACKGROUND,
and checkpoints are used to ensure at-least-once semantics.

Closes #50
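For illustration, a hedged sketch of selecting the flush mode with the fluent
setters added in the diff below; the KuduSink constructor arguments are
assumptions based on the fields in this diff:

    KuduSink<KuduRow> sink = new KuduSink<>(kuduMasters, tableInfo, serializer)
            .withAsyncFlushMode();   // AUTO_FLUSH_BACKGROUND: batches writes for
                                     // throughput; checkpoints give at-least-once
    stream.addSink(sink);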
---
 .../connectors/kudu/KuduOutputFormat.java  |  5 +-
 .../flink/streaming/connectors/kudu/KuduSink.java  | 59 --
 .../connectors/kudu/connector/KuduConnector.java   | 18 +--
 3 files changed, 73 insertions(+), 9 deletions(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduOutputFormat.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduOutputFormat.java
index 9d12710..c1301da 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduOutputFormat.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduOutputFormat.java
@@ -23,6 +23,7 @@ import 
org.apache.flink.streaming.connectors.kudu.connector.KuduRow;
 import org.apache.flink.streaming.connectors.kudu.connector.KuduTableInfo;
 import org.apache.flink.streaming.connectors.kudu.serde.KuduSerialization;
 import org.apache.flink.util.Preconditions;
+import org.apache.kudu.client.SessionConfiguration.FlushMode;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -30,6 +31,8 @@ import java.io.IOException;
 
 public class KuduOutputFormat extends RichOutputFormat {
 
+private static final long serialVersionUID = 1L;
+
 private static final Logger LOG = 
LoggerFactory.getLogger(KuduOutputFormat.class);
 
 private String kuduMasters;
@@ -87,7 +90,7 @@ public class KuduOutputFormat extends 
RichOutputFormat {
 @Override
 public void open(int taskNumber, int numTasks) throws IOException {
 if (connector != null) return;
-connector = new KuduConnector(kuduMasters, tableInfo, consistency, 
writeMode);
+connector = new KuduConnector(kuduMasters, tableInfo, consistency, 
writeMode,FlushMode.AUTO_FLUSH_SYNC);
 serializer = serializer.withSchema(tableInfo.getSchema());
 }
 
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduSink.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduSink.java
index 53cf249..b6dd9c8 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduSink.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/streaming/connectors/kudu/KuduSink.java
@@ -17,18 +17,25 @@
 package org.apache.flink.streaming.connectors.kudu;
 
 import org.apache.flink.configuration.Configuration;
+import org.apache.flink.runtime.state.FunctionInitializationContext;
+import org.apache.flink.runtime.state.FunctionSnapshotContext;
+import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction;
 import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
+import org.apache.flink.streaming.api.operators.StreamingRuntimeContext;
 import org.apache.flink.streaming.connectors.kudu.connector.KuduConnector;
 import org.apache.flink.streaming.connectors.kudu.connector.KuduRow;
 import org.apache.flink.streaming.connectors.kudu.connector.KuduTableInfo;
 import org.apache.flink.streaming.connectors.kudu.serde.KuduSerialization;
 import org.apache.flink.util.Preconditions;
+import org.apache.kudu.client.SessionConfiguration.FlushMode;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
 import java.io.IOException;
 
-public class KuduSink extends RichSinkFunction {
+public class KuduSink extends RichSinkFunction implements 
CheckpointedFunction {
+
+private static final long serialVersionUID = 1L;
 
 private static final Logger LOG = 
LoggerFactory.getLogger(KuduOutputFormat.class);
 
@@ -36,6 +43,7 @@ public class KuduSink extends RichSinkFunction {
 private KuduTableInfo tableInfo;
 private KuduConnector.Consistency consistency;
 private KuduConnector.WriteMode writeMode;
+private FlushMode flushMode;
 
 private KuduSerialization serializer;
 
@@ -77,11 +85,42 @@ public class KuduSink extends RichSinkFunction {
 return this;
 }
 
+public KuduSink withSyncFlushMode() {
+this.flushMode = FlushMode.AUTO_FLUSH_SYNC;
+return this;
+}
+
+public KuduSink withAsyncFlushMode() {
+this.flushMode = FlushMode.AUTO_FLUSH_BACKGROUND;
+return this;
+}
+
 @Override
 public void open(Configuration parameters) throws IOException {
-if (connector != null) 

(flink-connector-kudu) 23/44: [BAHIR-291] Bump flink to 1.14.0 (#136)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit b9cd23e729e0d874b7fd53e86a7890f73b939567
Author: Roc Marshal 
AuthorDate: Mon Dec 27 01:27:08 2021 +0800

[BAHIR-291] Bump flink to 1.14.0 (#136)
---
 flink-connector-kudu/pom.xml   | 2 +-
 .../java/org/apache/flink/connectors/kudu/table/KuduCatalog.java   | 4 ++--
 .../org/apache/flink/connectors/kudu/table/KuduCatalogFactory.java | 7 ++-
 .../org/apache/flink/connectors/kudu/table/KuduTableFactory.java   | 2 +-
 .../org/apache/flink/connectors/kudu/table/KuduTableSource.java| 2 +-
 .../apache/flink/connectors/kudu/table/utils/KuduTableUtils.java   | 2 +-
 6 files changed, 8 insertions(+), 11 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index a76102e..ac6cdc5 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -51,7 +51,7 @@
 
     <dependency>
       <groupId>org.apache.flink</groupId>
-      <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
+      <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
       <version>${flink.version}</version>
       <scope>provided</scope>
     </dependency>
 
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalog.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalog.java
index 2ca7c0e..d8343e8 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalog.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalog.java
@@ -42,7 +42,7 @@ import org.apache.flink.table.expressions.Expression;
 import org.apache.flink.table.factories.TableFactory;
 import org.apache.flink.util.StringUtils;
 
-import org.apache.flink.shaded.guava18.com.google.common.collect.Lists;
+import org.apache.flink.shaded.guava30.com.google.common.collect.Lists;
 
 import org.apache.kudu.ColumnSchema;
 import org.apache.kudu.client.AlterTableOptions;
@@ -237,7 +237,7 @@ public class KuduCatalog extends AbstractReadOnlyCatalog {
 
 @Override
 public void createTable(ObjectPath tablePath, CatalogBaseTable table, 
boolean ignoreIfExists) throws TableAlreadyExistException {
-Map<String, String> tableProperties = table.getProperties();
+Map<String, String> tableProperties = table.getOptions();
 TableSchema tableSchema = table.getSchema();
 
 Set<String> optionalProperties = new HashSet<>(Arrays.asList(KUDU_REPLICAS));
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalogFactory.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalogFactory.java
index 30aaa40..2458a56 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalogFactory.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduCatalogFactory.java
@@ -31,9 +31,6 @@ import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 
-import static 
org.apache.flink.table.descriptors.CatalogDescriptorValidator.CATALOG_PROPERTY_VERSION;
-import static 
org.apache.flink.table.descriptors.CatalogDescriptorValidator.CATALOG_TYPE;
-
 /**
  * Factory for {@link KuduCatalog}.
  */
@@ -45,8 +42,8 @@ public class KuduCatalogFactory implements CatalogFactory {
 @Override
 public Map<String, String> requiredContext() {
 Map<String, String> context = new HashMap<>();
-context.put(CATALOG_TYPE, KuduTableFactory.KUDU);
-context.put(CATALOG_PROPERTY_VERSION, "1"); // backwards compatibility
+context.put("type", KuduTableFactory.KUDU);
+context.put("property-version", "1"); // backwards compatibility
 return context;
 }
 
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
index 524f521..a2883af 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/table/KuduTableFactory.java
@@ -132,7 +132,7 @@ public class KuduTableFactory implements TableSourceFactory, TableSinkFacto
 public KuduTableSource createTableSource(ObjectPath tablePath, CatalogTable table) {
 validateTable(table);
 String tableName = table.toProperties().getOrDefault(KUDU_TABLE, tablePath.getObjectName());
-return createTableSource(tableName, table.getSchema(), table.getProperties());
+return createTableSource(tableName, table.getSchema(), table.getOptions());
 }
 
 private KuduTableSource createTableSource(String tableName, TableSchema schema, Map<String, String> props) {
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/tab
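The requiredContext() change above follows a general Flink 1.14 migration pattern: the removed CatalogDescriptorValidator constants are replaced by their literal string values. A minimal self-contained sketch of that pattern:

```
import java.util.HashMap;
import java.util.Map;

// Sketch of the migration shown above: with the descriptor-validator
// constants gone in Flink 1.14, factories register the same discovery
// keys as plain strings.
class CatalogContextSketch {
    static Map<String, String> requiredContext(String catalogType) {
        Map<String, String> context = new HashMap<>();
        context.put("type", catalogType);     // was CATALOG_TYPE
        context.put("property-version", "1"); // was CATALOG_PROPERTY_VERSION
        return context;
    }
}
```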

(flink-connector-kudu) 24/44: [BAHIR-296] Unify log4j libs

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 7700edb6a18d08cf2f93f84d4377807af1c826bd
Author: Joao Boto 
AuthorDate: Sun Dec 26 19:39:56 2021 +0100

[BAHIR-296] Unify log4j libs
---
 flink-connector-kudu/pom.xml | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index ac6cdc5..12fd9da 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -104,19 +104,19 @@
 <dependency>
   <groupId>org.apache.logging.log4j</groupId>
   <artifactId>log4j-api</artifactId>
-  <version>${log4j2.version}</version>
+  <version>${log4j.version}</version>
   <scope>test</scope>
 </dependency>
 <dependency>
   <groupId>org.apache.logging.log4j</groupId>
   <artifactId>log4j-core</artifactId>
-  <version>${log4j2.version}</version>
+  <version>${log4j.version}</version>
   <scope>test</scope>
 </dependency>
 <dependency>
   <groupId>org.apache.logging.log4j</groupId>
   <artifactId>log4j-slf4j-impl</artifactId>
-  <version>${log4j2.version}</version>
+  <version>${log4j.version}</version>
   <scope>test</scope>
 </dependency>



(flink-connector-kudu) 14/44: Fix "the client has already been closed" bug (#75)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 9b99c1862cad3c26b0d287c1fb13e15b588b1392
Author: WangJ1an <826204...@qq.com>
AuthorDate: Mon Jan 20 04:20:34 2020 +0800

Fix "the client has already been closed" bug (#75)

Fix the problem "Caused by: java.lang.IllegalStateException:
Cannot proceed, the client has already been closed" when
running in Flink local mode
---
 .../java/org/apache/flink/connectors/kudu/batch/KuduInputFormat.java | 1 +
 1 file changed, 1 insertion(+)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/batch/KuduInputFormat.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/batch/KuduInputFormat.java
index 3a35e6a..98877d8 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/batch/KuduInputFormat.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/batch/KuduInputFormat.java
@@ -98,6 +98,7 @@ public class KuduInputFormat extends 
RichInputFormat {
 }
 if (kuduReader != null) {
 kuduReader.close();
+kuduReader = null;
 }
 }
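The one-line fix works because it makes close() idempotent. A generic, self-contained illustration of the pattern (not the connector's actual classes):

```
import java.io.Closeable;
import java.io.IOException;

// Nulling the reference after close() turns any later close() call into a
// no-op instead of a call into an already-closed client.
class GuardedResource implements Closeable {
    private Closeable inner;

    GuardedResource(Closeable inner) {
        this.inner = inner;
    }

    @Override
    public void close() throws IOException {
        if (inner != null) {
            inner.close();
            inner = null; // a second close() now does nothing
        }
    }
}
```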
 



(flink-connector-kudu) 15/44: Fix NotSerializableException in Kudu connector (#74)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 6da6ad7ad427b23392d3bcbbd43391c05846ad5e
Author: Darcy <331046...@qq.com>
AuthorDate: Mon Jan 20 04:22:46 2020 +0800

Fix NotSerializableException in Kudu connector (#74)
---
 .../org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
index c37bc9a..c7ae4a4 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduFilterInfo.java
@@ -21,10 +21,11 @@ import org.apache.kudu.ColumnSchema;
 import org.apache.kudu.Schema;
 import org.apache.kudu.client.KuduPredicate;
 
+import java.io.Serializable;
 import java.util.List;
 
 @PublicEvolving
-public class KuduFilterInfo {
+public class KuduFilterInfo implements Serializable {
 
 private String column;
 private FilterType type;
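The fix reflects a general Flink rule: every object captured by a user function is shipped to the workers with Java serialization. A self-contained illustration (names are hypothetical, not connector code):

```
import java.io.Serializable;

// A descriptor held by a Flink source/sink function must be Serializable,
// and so must all of its fields; otherwise job submission fails with
// NotSerializableException.
class FilterDescriptor implements Serializable {
    private static final long serialVersionUID = 1L;

    private final String column; // String is Serializable, so this is safe

    FilterDescriptor(String column) {
        this.column = column;
    }
}
```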



(flink-connector-kudu) 04/44: [BAHIR-179] Fail Docker integration tests silently

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 701fabeeb897f13cfcde2e55c8a035c1589e74fc
Author: eskabetxe 
AuthorDate: Wed Jan 9 17:29:32 2019 +0100

[BAHIR-179] Fail Docker integration tests silently

When running Docker-based integration tests locally,
fail silently if the environment requirements are not available.

Closes #38
Closes #35
---
 flink-connector-kudu/pom.xml   | 42 --
 .../streaming/connectors/kudu/DockerTest.java  | 31 
 .../connectors/kudu/KuduInputFormatTest.java   |  4 +--
 .../connectors/kudu/KuduOuputFormatTest.java   |  3 +-
 .../streaming/connectors/kudu/KuduSinkTest.java|  2 +-
 .../src/test/resources/log4j.properties| 27 ++
 6 files changed, 67 insertions(+), 42 deletions(-)

diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 348371b..61ab4a6 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -31,7 +31,8 @@
 
   
 1.7.1
-5.2.0
+
+!DockerTest
   
 
   
@@ -58,45 +59,14 @@
   test
 
 
-<dependency>
-  <groupId>org.junit.jupiter</groupId>
-  <artifactId>junit-jupiter-api</artifactId>
-  <version>${junit.version}</version>
-  <scope>test</scope>
-</dependency>
-
   
 
   
 
-  default
-  
-true
-  
-  
-
-  
-org.apache.maven.plugins
-maven-surefire-plugin
-
-  
-**/*Test.java
-  
-
-  
-
-  
-
-
-  test-kudu
-  
-
-  
-org.apache.maven.plugins
-maven-surefire-plugin
-  
-
-  
+  docker-test
+  
+DockerTest
+  
 
   
 
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/DockerTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/DockerTest.java
new file mode 100644
index 000..070e634
--- /dev/null
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/DockerTest.java
@@ -0,0 +1,31 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.flink.streaming.connectors.kudu;
+
+import org.junit.jupiter.api.Tag;
+
+import java.lang.annotation.ElementType;
+import java.lang.annotation.Retention;
+import java.lang.annotation.RetentionPolicy;
+import java.lang.annotation.Target;
+
+@Target({ ElementType.TYPE, ElementType.METHOD })
+@Retention(RetentionPolicy.RUNTIME)
+@Tag("DockerTest")
+public @interface DockerTest {
+}
+
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormatTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormatTest.java
index 8cfc102..eb9dc00 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormatTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduInputFormatTest.java
@@ -26,11 +26,9 @@ import java.io.IOException;
 import java.util.ArrayList;
 import java.util.List;
 
+@DockerTest
 public class KuduInputFormatTest extends KuduDatabase {
 
-
-
-
 @Test
 public void testInvalidKuduMaster() throws IOException {
 KuduTableInfo tableInfo = booksTableInfo("books",false);
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
index 6eb5ebe..e282185 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/streaming/connectors/kudu/KuduOuputFormatTest.java
@@ -26,10 +26,9 @@ import java.io.IOException;
 import java.util.List;
 import java.util.UUID;
 
+@DockerTest
 public class KuduOuputFormatTest extends KuduDatabase {
 
-
-
 @Test
 public void te
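The @DockerTest meta-annotation above relies on JUnit 5 tag filtering. A hedged sketch of how a tagged test behaves (class and method names are illustrative):

```
import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;

// Equivalent to annotating with @DockerTest: the tag lets the build include
// or exclude the test via Surefire's JUnit 5 group filtering, so machines
// without a Docker environment skip it by default.
@Tag("DockerTest")
class RequiresDockerEnvironmentTest {
    @Test
    void runsOnlyWhenDockerTestTagIsSelected() {
        // would start and exercise a local Kudu container here
    }
}
```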

(flink-connector-kudu) 22/44: [BAHIR-293] Fix documentation tables

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 03f031288d7f97338e55a8534e59bc473bcfdea1
Author: Joao Boto 
AuthorDate: Tue Dec 7 11:09:26 2021 +0100

[BAHIR-293] Fix documentation tables
---
 flink-connector-kudu/README.md | 25 +
 1 file changed, 13 insertions(+), 12 deletions(-)

diff --git a/flink-connector-kudu/README.md b/flink-connector-kudu/README.md
index 6370aa6..a0e0234 100644
--- a/flink-connector-kudu/README.md
+++ b/flink-connector-kudu/README.md
@@ -154,18 +154,19 @@ The example uses lambda expressions to implement the 
functional interfaces.
 Read more about Kudu schema design in the [Kudu 
docs](https://kudu.apache.org/docs/schema_design.html).
 
 ### Supported data types
-| Flink/SQL | Kudu   | 
-| - |:-:| 
-|STRING | STRING| 
-| BOOLEAN   |BOOL   | 
-| TINYINT   |   INT8| 
-| SMALLINT  |  INT16| 
-| INT   |  INT32| 
-| BIGINT|   INT64 |
-| FLOAT |  FLOAT  |
-| DOUBLE|DOUBLE|
-| BYTES|BINARY|
-| TIMESTAMP(3) |UNIXTIME_MICROS |
+
+| Flink/SQL| Kudu|
+|--|:---:|
+| `STRING` | STRING  |
+| `BOOLEAN`| BOOL|
+| `TINYINT`| INT8|
+| `SMALLINT`   | INT16   |
+| `INT`| INT32   |
+| `BIGINT` | INT64   |
+| `FLOAT`  | FLOAT   |
+| `DOUBLE` | DOUBLE  |
+| `BYTES`  | BINARY  |
+| `TIMESTAMP(3)`   | UNIXTIME_MICROS |
 
 Note:
* `TIMESTAMP`s are fixed to a precision of 3, and the corresponding Java conversion class is `java.sql.Timestamp`
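A small self-contained sketch of the precision mapping in the table above: TIMESTAMP(3) carries milliseconds, while Kudu's UNIXTIME_MICROS stores microseconds since the epoch:

```
import java.sql.Timestamp;

// Milliseconds from java.sql.Timestamp widen by a factor of 1000 when
// written to Kudu's UNIXTIME_MICROS.
class TimestampMappingSketch {
    public static void main(String[] args) {
        Timestamp ts = Timestamp.valueOf("2019-08-18 19:00:00.123");
        long micros = ts.getTime() * 1000L; // epoch millis -> epoch micros
        System.out.println(micros);
    }
}
```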



(flink-connector-kudu) 02/44: [FLINK-34930] Remove Python CI

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 663ffd1a45b9e675058d708ef4f45546736a985f
Author: Ferenc Csaky 
AuthorDate: Thu Apr 11 20:53:49 2024 +0200

[FLINK-34930] Remove Python CI
---
 .github/workflows/push_pr.yml | 7 ---
 1 file changed, 7 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index 20a..72b98a2 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -36,10 +36,3 @@ jobs:
 with:
   flink_version: ${{ matrix.flink }}
   jdk_version: ${{ matrix.jdk }}
-  python_test:
-strategy:
-  matrix:
-flink: [ 1.17.2, 1.18.1, 1.19-SNAPSHOT ]
-uses: 
apache/flink-connector-shared-utils/.github/workflows/python_ci.yml@ci_utils
-with:
-  flink_version: ${{ matrix.flink }}
\ No newline at end of file



(flink-connector-kudu) 07/44: [BAHIR-199] Bump kudu version to 1.9.0

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 1e26505253818da3a5176281f7c5264978e1c71e
Author: Joao Boto 
AuthorDate: Thu Mar 7 21:51:06 2019 +0100

[BAHIR-199] Bump kudu version to 1.9.0

Closes #48
---
 flink-connector-kudu/README.md | 2 +-
 flink-connector-kudu/pom.xml   | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-connector-kudu/README.md b/flink-connector-kudu/README.md
index af2985b..9b75aa7 100644
--- a/flink-connector-kudu/README.md
+++ b/flink-connector-kudu/README.md
@@ -9,7 +9,7 @@ following dependency to your project:
  <version>1.1-SNAPSHOT</version>
 
 
-*Version Compatibility*: This module is compatible with Apache Kudu *1.7.1* (last stable version).
+*Version Compatibility*: This module is compatible with Apache Kudu *1.9.0* (last stable version).
 
 Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
 See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-stable/start/dependencies.html).
diff --git a/flink-connector-kudu/pom.xml b/flink-connector-kudu/pom.xml
index 61ab4a6..d51341b 100644
--- a/flink-connector-kudu/pom.xml
+++ b/flink-connector-kudu/pom.xml
@@ -30,7 +30,7 @@
   jar
 
   
-1.7.1
+1.9.0
 
 !DockerTest
   



(flink-connector-kudu) 13/44: Add support "upsert part of columns of a kudu table" (#70)

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit c380e400d4c81c83dabf7ff42f4a5bb193492704
Author: Darcy <331046...@qq.com>
AuthorDate: Sat Nov 30 04:02:56 2019 +0800

Add support "upsert part of columns of a kudu table" (#70)

Sometimes we don't want to upsert all columns of a Kudu table,
so we need to support upserting only a subset of a table's columns.
---
 .../flink/connectors/kudu/connector/KuduRow.java   |  4 
 .../kudu/connector/writer/KuduWriter.java  |  5 -
 .../connectors/kudu/batch/KuduOuputFormatTest.java |  2 ++
 .../connectors/kudu/connector/KuduDatabase.java| 26 --
 .../connectors/kudu/streaming/KuduSinkTest.java|  3 ++-
 5 files changed, 36 insertions(+), 4 deletions(-)

diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduRow.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduRow.java
index af78361..78e6e6e 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduRow.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/KuduRow.java
@@ -41,6 +41,10 @@ public class KuduRow extends Row {
 return super.getField(rowNames.get(name));
 }
 
+public boolean hasField(String name) {
+return rowNames.get(name) != null;
+}
+
 public void setField(int pos, String name, Object value) {
 super.setField(pos, value);
 this.rowNames.put(name, pos);
diff --git 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
index f4e2a8a..57c0741 100644
--- 
a/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
+++ 
b/flink-connector-kudu/src/main/java/org/apache/flink/connectors/kudu/connector/writer/KuduWriter.java
@@ -152,7 +152,10 @@ public class KuduWriter implements AutoCloseable {
 
 table.getSchema().getColumns().forEach(column -> {
 String columnName = column.getName();
-Object value = row.getField(column.getName());
+if (!row.hasField(columnName)) {
+return;
+}
+Object value = row.getField(columnName);
 
 if (value == null) {
 partialRow.setNull(columnName);
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOuputFormatTest.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOuputFormatTest.java
index 963a8c0..f14eaa0 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOuputFormatTest.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/batch/KuduOuputFormatTest.java
@@ -71,6 +71,7 @@ class KuduOuputFormatTest extends KuduDatabase {
 
 List<KuduRow> rows = readRows(tableInfo);
 Assertions.assertEquals(5, rows.size());
+kuduRowsTest(rows);
 
 cleanDatabase(tableInfo);
 }
@@ -99,6 +100,7 @@ class KuduOuputFormatTest extends KuduDatabase {
 
 List<KuduRow> rows = readRows(tableInfo);
 Assertions.assertEquals(5, rows.size());
+kuduRowsTest(rows);
 
 cleanDatabase(tableInfo);
 }
diff --git 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduDatabase.java
 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduDatabase.java
index 3d02a1d..cda8c21 100644
--- 
a/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduDatabase.java
+++ 
b/flink-connector-kudu/src/test/java/org/apache/flink/connectors/kudu/connector/KuduDatabase.java
@@ -56,14 +56,16 @@ public class KuduDatabase {
 .addColumn(KuduColumnInfo.Builder.create("id", 
Type.INT32).key(true).hashKey(true).build())
 .addColumn(KuduColumnInfo.Builder.create("title", 
Type.STRING).build())
 .addColumn(KuduColumnInfo.Builder.create("author", 
Type.STRING).build())
-.addColumn(KuduColumnInfo.Builder.create("price", 
Type.DOUBLE).build())
-.addColumn(KuduColumnInfo.Builder.create("quantity", 
Type.INT32).build())
+.addColumn(KuduColumnInfo.Builder.create("price", 
Type.DOUBLE).asNullable().build())
+.addColumn(KuduColumnInfo.Builder.create("quantity", 
Type.INT32).asNullable().build())
 .build();
 }
 
 protected static List booksDataRow() {
 return Arrays.stream(booksTableData)
 .map(row -> {
+
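A stand-in sketch of the rule this change introduces (plain Maps replace KuduRow and PartialRow): fields absent from the row are skipped, so an upsert leaves those columns untouched, while an explicit null is still written as NULL:

```
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative projection step behind "upsert part of columns": only the
// columns the incoming row actually carries become part of the mutation.
class PartialUpsertSketch {
    static Map<String, Object> project(List<String> tableColumns, Map<String, Object> row) {
        Map<String, Object> mutation = new HashMap<>();
        for (String column : tableColumns) {
            if (!row.containsKey(column)) {
                continue; // absent field: leave the column untouched on upsert
            }
            mutation.put(column, row.get(column)); // null still means "set to NULL"
        }
        return mutation;
    }
}
```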

(flink-connector-kudu) 03/44: [BAHIR-99] kudu connector

2024-05-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit 37a3e55b5253027c5f475ef5e3c7c8faa42ffebe
Author: Joao Boto 
AuthorDate: Wed Jul 25 20:17:36 2018 +0200

[BAHIR-99] kudu connector
---
 flink-connector-kudu/README.md |  98 +
 flink-connector-kudu/dockers/docker-compose.yml|  92 +
 flink-connector-kudu/dockers/role/Dockerfile   |  41 
 .../dockers/role/docker-entrypoint.sh  |  69 +++
 flink-connector-kudu/dockers/run_kudu_tests.sh |  68 +++
 flink-connector-kudu/dockers/start-images.sh   |  42 
 flink-connector-kudu/dockers/stop-images.sh|  33 
 flink-connector-kudu/pom.xml   | 103 ++
 .../streaming/connectors/kudu/KuduInputFormat.java | 218 +
 .../connectors/kudu/KuduOutputFormat.java  | 110 +++
 .../flink/streaming/connectors/kudu/KuduSink.java  | 106 ++
 .../connectors/kudu/connector/KuduColumnInfo.java  | 161 +++
 .../connectors/kudu/connector/KuduConnector.java   | 133 +
 .../connectors/kudu/connector/KuduFilterInfo.java  | 173 
 .../connectors/kudu/connector/KuduMapper.java  | 146 ++
 .../connectors/kudu/connector/KuduRow.java | 137 +
 .../connectors/kudu/connector/KuduTableInfo.java   | 133 +
 .../connectors/kudu/KuduInputFormatTest.java   |  91 +
 .../connectors/kudu/KuduOuputFormatTest.java   |  93 +
 .../streaming/connectors/kudu/KuduSinkTest.java|  89 +
 .../connectors/kudu/connector/KuduDatabase.java|  89 +
 21 files changed, 2225 insertions(+)

diff --git a/flink-connector-kudu/README.md b/flink-connector-kudu/README.md
new file mode 100644
index 000..af2985b
--- /dev/null
+++ b/flink-connector-kudu/README.md
@@ -0,0 +1,98 @@
+# Flink Kudu Connector
+
+This connector provides a source (```KuduInputFormat```) and a sink/output (```KuduSink``` and ```KuduOutputFormat```, respectively) that can read and write to [Kudu](https://kudu.apache.org/). To use this connector, add the
+following dependency to your project:
+
+<dependency>
+  <groupId>org.apache.bahir</groupId>
+  <artifactId>flink-connector-kudu_2.11</artifactId>
+  <version>1.1-SNAPSHOT</version>
+</dependency>
+
+*Version Compatibility*: This module is compatible with Apache Kudu *1.7.1* (last stable version).
+
+Note that the streaming connectors are not part of the binary distribution of Flink. You need to link them into your job jar for cluster execution.
+See how to link with them for cluster execution [here](https://ci.apache.org/projects/flink/flink-docs-stable/start/dependencies.html).
+
+## Installing Kudu
+
+Follow the instructions from the [Kudu Installation Guide](https://kudu.apache.org/docs/installation.html).
+Optionally, you can use the Docker images provided in the dockers folder.
+
+## KuduInputFormat
+
+```
+ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
+
+env.setParallelism(PARALLELISM);
+
+// create a table info object
+KuduTableInfo tableInfo = KuduTableInfo.Builder
+.create("books")
+.addColumn(KuduColumnInfo.Builder.create("id", 
Type.INT32).key(true).hashKey(true).build())
+.addColumn(KuduColumnInfo.Builder.create("title", Type.STRING).build())
+.addColumn(KuduColumnInfo.Builder.create("author", 
Type.STRING).build())
+.addColumn(KuduColumnInfo.Builder.create("price", Type.DOUBLE).build())
+.addColumn(KuduColumnInfo.Builder.create("quantity", 
Type.INT32).build())
+.build();
+
+// Pass the tableInfo to the KuduInputFormat and provide kuduMaster ips
+env.createInput(new KuduInputFormat<>("172.25.0.6", tableInfo))
+.count();
+
+env.execute();
+```
+
+## KuduOutputFormat
+
+```
+ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
+
+env.setParallelism(PARALLELISM);
+
+// create a table info object
+KuduTableInfo tableInfo = KuduTableInfo.Builder
+.create("books")
+.createIfNotExist(true)
+.replicas(1)
+.addColumn(KuduColumnInfo.Builder.create("id", 
Type.INT32).key(true).hashKey(true).build())
+.addColumn(KuduColumnInfo.Builder.create("title", Type.STRING).build())
+.addColumn(KuduColumnInfo.Builder.create("author", 
Type.STRING).build())
+.addColumn(KuduColumnInfo.Builder.create("price", Type.DOUBLE).build())
+.addColumn(KuduColumnInfo.Builder.create("quantity", 
Type.INT32).build())
+.build();
+
+...
+
+env.fromCollection(books)
+.output(new KuduOutputFormat<>("172.25.0.6", tableInfo));
+
+env.execute();
+```
+
+## KuduSink
+
+```
+StreamExecutionEnvironment en

(flink-connector-hbase) branch main updated: [FLINK-35233] Fix lookup cache reuse RowData object problem (#47)

2024-04-29 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


The following commit(s) were added to refs/heads/main by this push:
 new b7850a5  [FLINK-35233] Fix lookup cache reuse RowData object problem 
(#47)
b7850a5 is described below

commit b7850a54d9f3958cc67e5985283ce077b98d973b
Author: Tan-JiaLiang 
AuthorDate: Mon Apr 29 22:35:22 2024 +0800

[FLINK-35233] Fix lookup cache reuse RowData object problem (#47)

* fix: convertToReusedRow() is used by default and returns a reused object. If
lookup.cache is enabled, the result wrapped in that reused object is cached
externally, so all cached values end up referencing the same object.

* [FLINK-35233] Fix lookup cache reuse RowData object problem

-

Co-authored-by: xiekunyuan 
---
 .../connector/hbase2/HBaseConnectorITCase.java | 39 ++
 .../hbase/source/HBaseRowDataLookupFunction.java   |  2 +-
 2 files changed, 33 insertions(+), 8 deletions(-)

diff --git 
a/flink-connector-hbase-2.2/src/test/java/org/apache/flink/connector/hbase2/HBaseConnectorITCase.java
 
b/flink-connector-hbase-2.2/src/test/java/org/apache/flink/connector/hbase2/HBaseConnectorITCase.java
index a946edd..9a2736e 100644
--- 
a/flink-connector-hbase-2.2/src/test/java/org/apache/flink/connector/hbase2/HBaseConnectorITCase.java
+++ 
b/flink-connector-hbase-2.2/src/test/java/org/apache/flink/connector/hbase2/HBaseConnectorITCase.java
@@ -48,6 +48,8 @@ import org.apache.hadoop.hbase.TableName;
 import org.apache.hadoop.hbase.TableNotFoundException;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.junit.jupiter.api.Test;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.EnumSource;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -591,14 +593,16 @@ class HBaseConnectorITCase extends HBaseTestBase {
 assertThat(result).isEqualTo(expected);
 }
 
-@Test
-void testHBaseLookupTableSource() {
-verifyHBaseLookupJoin(false);
+@ParameterizedTest
+@EnumSource(Caching.class)
+void testHBaseLookupTableSource(Caching caching) {
+verifyHBaseLookupJoin(caching, false);
 }
 
-@Test
-void testHBaseAsyncLookupTableSource() {
-verifyHBaseLookupJoin(true);
+@ParameterizedTest
+@EnumSource(Caching.class)
+void testHBaseAsyncLookupTableSource(Caching caching) {
+verifyHBaseLookupJoin(caching, true);
 }
 
 @Test
@@ -661,10 +665,22 @@ class HBaseConnectorITCase extends HBaseTestBase {
 sinkFunction.close();
 }
 
-private void verifyHBaseLookupJoin(boolean async) {
+private void verifyHBaseLookupJoin(Caching caching, boolean async) {
 StreamExecutionEnvironment execEnv = StreamExecutionEnvironment.getExecutionEnvironment();
 StreamTableEnvironment tEnv = StreamTableEnvironment.create(execEnv, streamSettings);
 
+String cacheOptions = "";
+if (caching == Caching.ENABLE_CACHE) {
+cacheOptions =
+","
++ String.join(
+",",
+Arrays.asList(
+"'lookup.cache' = 'PARTIAL'",
+"'lookup.partial-cache.max-rows' = 
'1000'",
+
"'lookup.partial-cache.expire-after-write' = '10min'"));
+}
+
 tEnv.executeSql(
 "CREATE TABLE "
 + TEST_TABLE_1
@@ -686,6 +702,7 @@ class HBaseConnectorITCase extends HBaseTestBase {
 + " 'zookeeper.quorum' = '"
 + getZookeeperQuorum()
 + "'"
++ cacheOptions
 + ")");
 
 // prepare a source table
@@ -722,6 +739,8 @@ class HBaseConnectorITCase extends HBaseTestBase {
 .collect(Collectors.toList());
 
 List<String> expected = new ArrayList<>();
+expected.add(
+"+I[1, 1, 10, Hello-1, 100, 1.01, false, Welt-1, 2019-08-18T19:00, 2019-08-18, 19:00, 12345678.0001]");
 expected.add(
 "+I[1, 1, 10, Hello-1, 100, 1.01, false, Welt-1, 2019-08-18T19:00, 2019-08-18, 19:00, 12345678.0001]");
 expected.add(
@@ -750,6 +769,12 @@ class HBaseConnectorITCase extends HBaseTestBase {
 testData.add(Row.of(2, 2L, "Hello"));
 testData.add(Row.of(3, 2L, "Hello world"));
 testData.add(Row.of(3, 3L, "Hello world!"));
+testData.add(Row.of(1, 1L, "Hi"));
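A self-contained illustration of the aliasing bug this change guards against: caching a reused holder object makes every cache entry point at the latest row, while copying before caching keeps the values distinct (Holder stands in for Flink's reused RowData):

```
import java.util.ArrayList;
import java.util.List;

class ReusedObjectCacheBug {
    static final class Holder {
        int value;
        Holder copyOf() { Holder h = new Holder(); h.value = value; return h; }
    }

    public static void main(String[] args) {
        Holder reused = new Holder();
        List<Holder> cache = new ArrayList<>();
        for (int i = 1; i <= 3; i++) {
            reused.value = i;
            cache.add(reused.copyOf()); // cache.add(reused) would make all entries print 3
        }
        cache.forEach(h -> System.out.println(h.value)); // 1 2 3
    }
}
```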

(flink-connector-kafka) branch v3.1 updated: [FLINK-35038] Bump `org.yaml:snakeyaml` to `2.2` (#93)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new ad798fc5 [FLINK-35038] Bump `org.yaml:snakeyaml` to `2.2` (#93)
ad798fc5 is described below

commit ad798fc5387ba3582f92516697d60d0f523e86cb
Author: Ufuk Celebi <1756620+...@users.noreply.github.com>
AuthorDate: Thu Apr 11 13:47:46 2024 +0200

[FLINK-35038] Bump `org.yaml:snakeyaml` to `2.2` (#93)

* Bump org.yaml:snakeyaml from 1.31 to 2.0 in /flink-connector-kafka

Bumps [org.yaml:snakeyaml](https://bitbucket.org/snakeyaml/snakeyaml) from 
1.31 to 2.0.
- 
[Commits](https://bitbucket.org/snakeyaml/snakeyaml/branches/compare/snakeyaml-2.0..snakeyaml-1.31)

---
updated-dependencies:
- dependency-name: org.yaml:snakeyaml
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] 

* [FLINK-X] Bump org.yaml:snakeyaml from 2.0 to 2.2

* [FLINK-35038] Fix SnakeYAML usage after version bump

SnakeYAML introduced breaking API changes and behavior changes between 1.31 and 2.2.
This commit uses the updated APIs and explicitly allows the global tag
for StreamMetadata (see changed SnakeYAML behavior in [1]).

[1] 
https://github.com/snakeyaml/snakeyaml/commit/2b8d47c8bcfd402e7a682b7b2674e8d0cb25e522

-

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] 
<49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Martijn Visser 
<2989614+martijnvis...@users.noreply.github.com>
(cherry picked from commit 369e7be46a70fd50d68746498aed82105741e7d6)
---
 flink-connector-kafka/pom.xml   |  2 +-
 .../connector/kafka/testutils/YamlFileMetadataService.java  | 13 ++---
 2 files changed, 11 insertions(+), 4 deletions(-)

diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 529d9252..2fa9d9ca 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -174,7 +174,7 @@ under the License.
 <dependency>
     <groupId>org.yaml</groupId>
     <artifactId>snakeyaml</artifactId>
-    <version>1.31</version>
+    <version>2.2</version>
     <scope>test</scope>
 </dependency>
 
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
index 524f7243..4a1dab17 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
@@ -29,6 +29,7 @@ import org.apache.kafka.clients.CommonClientConfigs;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.yaml.snakeyaml.DumperOptions;
+import org.yaml.snakeyaml.LoaderOptions;
 import org.yaml.snakeyaml.TypeDescription;
 import org.yaml.snakeyaml.Yaml;
 import org.yaml.snakeyaml.constructor.Constructor;
@@ -252,19 +253,25 @@ public class YamlFileMetadataService implements KafkaMetadataService {
 }
 
 private static Yaml initYamlParser() {
-Representer representer = new Representer();
+DumperOptions dumperOptions = new DumperOptions();
+Representer representer = new Representer(dumperOptions);
 representer.addClassTag(StreamMetadata.class, Tag.MAP);
 TypeDescription typeDescription = new TypeDescription(StreamMetadata.class);
 representer.addTypeDescription(typeDescription);
 representer.setDefaultFlowStyle(DumperOptions.FlowStyle.BLOCK);
-return new Yaml(new ListConstructor<>(StreamMetadata.class), representer);
+LoaderOptions loaderOptions = new LoaderOptions();
+// Allow global tag for StreamMetadata
+loaderOptions.setTagInspector(
+        tag -> tag.getClassName().equals(StreamMetadata.class.getName()));
+return new Yaml(new ListConstructor<>(StreamMetadata.class, loaderOptions), representer);
 }
 
 /** A custom constructor is required to read yaml lists at the root. */
 private static class ListConstructor<T> extends Constructor {
 private final Class<T> clazz;
 
-public ListConstructor(final Class<T> clazz) {
+public ListConstructor(final Class<T> clazz, final LoaderOptions loaderOptions) {
+super(loaderOptions);
 this.clazz = clazz;
 }
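A minimal self-contained sketch of the SnakeYAML 2.x surface used above (the Config bean is illustrative): Constructor now takes explicit LoaderOptions, and global tags are rejected unless a TagInspector allow-lists them:

```
import org.yaml.snakeyaml.LoaderOptions;
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.Constructor;

class SnakeYaml2Sketch {
    public static void main(String[] args) {
        LoaderOptions options = new LoaderOptions();
        options.setTagInspector(tag -> false); // deny all global tags
        Yaml yaml = new Yaml(new Constructor(Config.class, options));
        Config c = yaml.load("name: demo");
        System.out.println(c.name);
    }

    public static class Config {
        public String name;
    }
}
```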
 



(flink-connector-kafka) branch main updated: [FLINK-35038] Bump `org.yaml:snakeyaml` to `2.2` (#93)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 369e7be4 [FLINK-35038] Bump `org.yaml:snakeyaml` to `2.2` (#93)
369e7be4 is described below

commit 369e7be46a70fd50d68746498aed82105741e7d6
Author: Ufuk Celebi <1756620+...@users.noreply.github.com>
AuthorDate: Thu Apr 11 13:47:46 2024 +0200

[FLINK-35038] Bump `org.yaml:snakeyaml` to `2.2` (#93)

* Bump org.yaml:snakeyaml from 1.31 to 2.0 in /flink-connector-kafka

Bumps [org.yaml:snakeyaml](https://bitbucket.org/snakeyaml/snakeyaml) from 
1.31 to 2.0.
- 
[Commits](https://bitbucket.org/snakeyaml/snakeyaml/branches/compare/snakeyaml-2.0..snakeyaml-1.31)

---
updated-dependencies:
- dependency-name: org.yaml:snakeyaml
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] 

* [FLINK-X] Bump org.yaml:snakeyaml from 2.0 to 2.2

* [FLINK-35038] Fix SnakeYAML usage after version bump

SnakeYAML introduced breaking API changes and behavior changes between 1.31 and 2.2.
This commit uses the updated APIs and explicitly allows the global tag
for StreamMetadata (see changed SnakeYAML behavior in [1]).

[1] 
https://github.com/snakeyaml/snakeyaml/commit/2b8d47c8bcfd402e7a682b7b2674e8d0cb25e522

-

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] 
<49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Martijn Visser 
<2989614+martijnvis...@users.noreply.github.com>
---
 flink-connector-kafka/pom.xml   |  2 +-
 .../connector/kafka/testutils/YamlFileMetadataService.java  | 13 ++---
 2 files changed, 11 insertions(+), 4 deletions(-)

diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 529d9252..2fa9d9ca 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -174,7 +174,7 @@ under the License.
 <dependency>
     <groupId>org.yaml</groupId>
     <artifactId>snakeyaml</artifactId>
-    <version>1.31</version>
+    <version>2.2</version>
     <scope>test</scope>
 </dependency>
 
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
index 524f7243..4a1dab17 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
@@ -29,6 +29,7 @@ import org.apache.kafka.clients.CommonClientConfigs;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.yaml.snakeyaml.DumperOptions;
+import org.yaml.snakeyaml.LoaderOptions;
 import org.yaml.snakeyaml.TypeDescription;
 import org.yaml.snakeyaml.Yaml;
 import org.yaml.snakeyaml.constructor.Constructor;
@@ -252,19 +253,25 @@ public class YamlFileMetadataService implements KafkaMetadataService {
 }
 
 private static Yaml initYamlParser() {
-Representer representer = new Representer();
+DumperOptions dumperOptions = new DumperOptions();
+Representer representer = new Representer(dumperOptions);
 representer.addClassTag(StreamMetadata.class, Tag.MAP);
 TypeDescription typeDescription = new TypeDescription(StreamMetadata.class);
 representer.addTypeDescription(typeDescription);
 representer.setDefaultFlowStyle(DumperOptions.FlowStyle.BLOCK);
-return new Yaml(new ListConstructor<>(StreamMetadata.class), representer);
+LoaderOptions loaderOptions = new LoaderOptions();
+// Allow global tag for StreamMetadata
+loaderOptions.setTagInspector(
+        tag -> tag.getClassName().equals(StreamMetadata.class.getName()));
+return new Yaml(new ListConstructor<>(StreamMetadata.class, loaderOptions), representer);
 }
 
 /** A custom constructor is required to read yaml lists at the root. */
 private static class ListConstructor<T> extends Constructor {
 private final Class<T> clazz;
 
-public ListConstructor(final Class<T> clazz) {
+public ListConstructor(final Class<T> clazz, final LoaderOptions loaderOptions) {
+super(loaderOptions);
 this.clazz = clazz;
 }
 



(flink-connector-kafka) 01/02: [hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly runs

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit bb2a9fc3634098f555f492732731ad1286166466
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Thu Apr 11 12:09:22 2024 +0200

[hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly runs

(cherry picked from commit 2f5e3cf1e43ee38d68e47c5c72ec0fd83a904295)
---
 .github/workflows/weekly.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index f24769ae..1d909468 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -51,7 +51,7 @@ jobs:
 }, {
   flink: 1.19.0,
   branch: v3.1,
-  jdk: '8, 11, 17',
+  jdk: '8, 11, 17, 21',
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:



(flink-connector-kafka) branch v3.1 updated (809cb078 -> 4168d0f2)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


from 809cb078 [FLINK-35007] Add support for Flink 1.19 (#90)
 new bb2a9fc3 [hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly 
runs
 new 4168d0f2 [FLINK-35008] Bump org.apache.commons:commons-compress from 
1.25.0 to 1.26.1 (#87)

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/weekly.yml | 2 +-
 pom.xml  | 9 -
 2 files changed, 9 insertions(+), 2 deletions(-)



(flink-connector-kafka) 02/02: [FLINK-35008] Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.1 (#87)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit 4168d0f22f2fb6b696b5e09d7b8d1f99a6714b78
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
AuthorDate: Thu Apr 11 13:43:59 2024 +0200

[FLINK-35008] Bump org.apache.commons:commons-compress from 1.25.0 to 
1.26.1 (#87)

* [FLINK-35008] Bump org.apache.commons:commons-compress from 1.25.0 to 
1.26.1

Bumps org.apache.commons:commons-compress from 1.25.0 to 1.26.1.

---
updated-dependencies:
- dependency-name: org.apache.commons:commons-compress
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] 

* [FLINK-35008] Address dependency convergence due to bumped 
commons-compress

-

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] 
<49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Martijn Visser 
<2989614+martijnvis...@users.noreply.github.com>
(cherry picked from commit 1c39e3b7495640c9b3784ec672097741c072cebb)
---
 pom.xml | 9 -
 1 file changed, 8 insertions(+), 1 deletion(-)

diff --git a/pom.xml b/pom.xml
index 15543767..907bd67a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -424,7 +424,14 @@ under the License.
 <dependency>
     <groupId>org.apache.commons</groupId>
     <artifactId>commons-compress</artifactId>
-    <version>1.25.0</version>
+    <version>1.26.1</version>
+</dependency>
+
+<!-- Pinned to address dependency convergence caused by commons-compress -->
+<dependency>
+    <groupId>commons-io</groupId>
+    <artifactId>commons-io</artifactId>
+    <version>2.15.1</version>
 </dependency>
 
 



(flink-connector-kafka) branch main updated: [FLINK-35008] Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.1 (#87)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 1c39e3b7 [FLINK-35008] Bump org.apache.commons:commons-compress from 
1.25.0 to 1.26.1 (#87)
1c39e3b7 is described below

commit 1c39e3b7495640c9b3784ec672097741c072cebb
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
AuthorDate: Thu Apr 11 13:43:59 2024 +0200

[FLINK-35008] Bump org.apache.commons:commons-compress from 1.25.0 to 
1.26.1 (#87)

* [FLINK-35008] Bump org.apache.commons:commons-compress from 1.25.0 to 
1.26.1

Bumps org.apache.commons:commons-compress from 1.25.0 to 1.26.1.

---
updated-dependencies:
- dependency-name: org.apache.commons:commons-compress
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] 

* [FLINK-35008] Address dependency convergence due to bumped 
commons-compress

-

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] 
<49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Martijn Visser 
<2989614+martijnvis...@users.noreply.github.com>
---
 pom.xml | 9 -
 1 file changed, 8 insertions(+), 1 deletion(-)

diff --git a/pom.xml b/pom.xml
index 15543767..907bd67a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -424,7 +424,14 @@ under the License.
 <dependency>
     <groupId>org.apache.commons</groupId>
     <artifactId>commons-compress</artifactId>
-    <version>1.25.0</version>
+    <version>1.26.1</version>
+</dependency>
+
+<!-- Pinned to address dependency convergence caused by commons-compress -->
+<dependency>
+    <groupId>commons-io</groupId>
+    <artifactId>commons-io</artifactId>
+    <version>2.15.1</version>
 </dependency>
 
 



(flink-connector-kafka) branch v3.1 updated: [FLINK-35007] Add support for Flink 1.19 (#90)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new 809cb078 [FLINK-35007] Add support for Flink 1.19 (#90)
809cb078 is described below

commit 809cb0786565b3515d1a17319b0f98f59b1ef6c2
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Thu Apr 11 10:15:31 2024 +0200

[FLINK-35007] Add support for Flink 1.19 (#90)

* [FLINK-35007] Add support for Flink 1.19

This also includes dropping the weekly tests for the `v3.0` branch, since 
`v3.1` has been released and that's our main version going forward.

* [FLINK-35007] Remove unused test class that relied on removed Internal 
class

* [FLINK-35007][ci] Copy old `flink-conf.yaml` to make sure that all Python 
tests work for Flink 1.x releases

(cherry picked from commit 897001d5682a0708042d59be81a10485ffd0dde7)
---
 .github/workflows/push_pr.yml  |   4 +-
 .github/workflows/weekly.yml   |  17 +-
 .../connectors/kafka/testutils/DataGenerators.java |  29 --
 flink-python/dev/integration_test.sh   |  18 ++
 flink-python/pyflink/flink-conf.yaml   | 311 +
 5 files changed, 344 insertions(+), 35 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index 00e2f788..7f30c691 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -30,7 +30,7 @@ jobs:
 include:
   - flink: 1.18.1
 jdk: '8, 11, 17'
-  - flink: 1.19-SNAPSHOT
+  - flink: 1.19.0
 jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
@@ -39,7 +39,7 @@ jobs:
   python_test:
 strategy:
   matrix:
-flink: [ 1.17.2, 1.18.1 ]
+flink: [ 1.17.2, 1.18.1, 1.19.0 ]
 uses: 
apache/flink-connector-shared-utils/.github/workflows/python_ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
\ No newline at end of file
diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index e6bf27dd..f24769ae 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -38,11 +38,20 @@ jobs:
   jdk: '8, 11, 17, 21',
   branch: main
 }, {
-  flink: 1.17.1,
-  branch: v3.0
+  flink: 1.20-SNAPSHOT,
+  jdk: '8, 11, 17, 21',
+  branch: main
+}, {
+  flink: 1.17.2,
+  branch: v3.1
 }, {
-  flink: 1.18.0,
-  branch: v3.0
+  flink: 1.18.1,
+  jdk: '8, 11, 17',
+  branch: v3.1
+}, {
+  flink: 1.19.0,
+  branch: v3.1,
+  jdk: '8, 11, 17',
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
index 83ee3fb1..d660bd2f 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
@@ -18,16 +18,13 @@
 
 package org.apache.flink.streaming.connectors.kafka.testutils;
 
-import org.apache.flink.api.common.JobExecutionResult;
 import org.apache.flink.api.common.restartstrategy.RestartStrategies;
 import org.apache.flink.api.common.serialization.SimpleStringSchema;
 import 
org.apache.flink.api.common.serialization.TypeInformationSerializationSchema;
 import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
-import org.apache.flink.api.dag.Transformation;
 import org.apache.flink.streaming.api.datastream.DataStream;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import 
org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction;
-import org.apache.flink.streaming.api.graph.StreamGraph;
 import org.apache.flink.streaming.api.operators.StreamSink;
 import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerBase;
 import org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment;
@@ -36,8 +33,6 @@ import 
org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartiti
 import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
 import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness;
 
-import java.util.Collections;
-import java.util.List;
 import java.util.Properties;
 import java.util.Random;
 
@@ -210,29 +205,5 @@ public class DataGenerators {
 public Throwable getError() {
 

(flink-connector-kafka) branch dependabot/maven/org.apache.commons-commons-compress-1.26.0 updated (edba5a2b -> 0e8bcbec)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch 
dependabot/maven/org.apache.commons-commons-compress-1.26.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


from edba5a2b [FLINK-35008] Bump org.apache.commons:commons-compress from 
1.25.0 to 1.26.1
 add 0e8bcbec [FLINK-35008] Address dependency convergence due to bumped 
commons-compress

No new revisions were added by this update.

Summary of changes:
 pom.xml | 7 +++
 1 file changed, 7 insertions(+)



(flink-connector-kafka) branch dependabot/maven/org.apache.commons-commons-compress-1.26.0 updated (57a63919 -> edba5a2b)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch 
dependabot/maven/org.apache.commons-commons-compress-1.26.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


 discard 57a63919 [FLINK-35008] Address dependency convergence due to bumped 
commons-compress
 discard 84823602 [FLINK-35008] Bump org.apache.commons:commons-compress from 
1.25.0 to 1.26.0
 add 897001d5 [FLINK-35007] Add support for Flink 1.19 (#90)
 add 2f5e3cf1 [hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly 
runs
 add edba5a2b [FLINK-35008] Bump org.apache.commons:commons-compress from 
1.25.0 to 1.26.1

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (57a63919)
\
 N -- N -- N   
refs/heads/dependabot/maven/org.apache.commons-commons-compress-1.26.0 
(edba5a2b)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .github/workflows/push_pr.yml  |   4 +-
 .github/workflows/weekly.yml   |  12 +-
 .../connectors/kafka/testutils/DataGenerators.java |  29 --
 flink-python/dev/integration_test.sh   |  18 ++
 flink-python/pyflink/flink-conf.yaml   | 311 +
 pom.xml|   9 +-
 6 files changed, 339 insertions(+), 44 deletions(-)
 create mode 100644 flink-python/pyflink/flink-conf.yaml



(flink-connector-kafka) branch main updated: [hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly runs

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 2f5e3cf1 [hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly 
runs
2f5e3cf1 is described below

commit 2f5e3cf1e43ee38d68e47c5c72ec0fd83a904295
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Thu Apr 11 12:09:22 2024 +0200

[hotfix] Add Java 21 as JDK to test for Flink 1.19.0 weekly runs
---
 .github/workflows/weekly.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index f24769ae..1d909468 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -51,7 +51,7 @@ jobs:
 }, {
   flink: 1.19.0,
   branch: v3.1,
-  jdk: '8, 11, 17',
+  jdk: '8, 11, 17, 21',
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:



(flink-connector-kafka) branch main updated: [FLINK-35007] Add support for Flink 1.19 (#90)

2024-04-11 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 897001d5 [FLINK-35007] Add support for Flink 1.19 (#90)
897001d5 is described below

commit 897001d5682a0708042d59be81a10485ffd0dde7
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Thu Apr 11 10:15:31 2024 +0200

[FLINK-35007] Add support for Flink 1.19 (#90)

* [FLINK-35007] Add support for Flink 1.19

This also includes dropping the weekly tests for the `v3.0` branch, since 
`v3.1` has been released and that's our main version going forward.

* [FLINK-35007] Remove unused test class that relied on removed Internal 
class

* [FLINK-35007][ci] Copy old `flink-conf.yaml` to make sure that all Python 
tests work for Flink 1.x releases
---
 .github/workflows/push_pr.yml  |   4 +-
 .github/workflows/weekly.yml   |  12 +-
 .../connectors/kafka/testutils/DataGenerators.java |  29 --
 flink-python/dev/integration_test.sh   |  18 ++
 flink-python/pyflink/flink-conf.yaml   | 311 +
 5 files changed, 338 insertions(+), 36 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index 20a0..7f30c691 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -30,7 +30,7 @@ jobs:
 include:
   - flink: 1.18.1
 jdk: '8, 11, 17'
-  - flink: 1.19-SNAPSHOT
+  - flink: 1.19.0
 jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
@@ -39,7 +39,7 @@ jobs:
   python_test:
 strategy:
   matrix:
-flink: [ 1.17.2, 1.18.1, 1.19-SNAPSHOT ]
+flink: [ 1.17.2, 1.18.1, 1.19.0 ]
 uses: 
apache/flink-connector-shared-utils/.github/workflows/python_ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
\ No newline at end of file
diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index aaa729fd..f24769ae 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -37,6 +37,10 @@ jobs:
   flink: 1.19-SNAPSHOT,
   jdk: '8, 11, 17, 21',
   branch: main
+}, {
+  flink: 1.20-SNAPSHOT,
+  jdk: '8, 11, 17, 21',
+  branch: main
 }, {
   flink: 1.17.2,
   branch: v3.1
@@ -45,11 +49,9 @@ jobs:
   jdk: '8, 11, 17',
   branch: v3.1
 }, {
-  flink: 1.17.2,
-  branch: v3.0
-}, {
-  flink: 1.18.1,
-  branch: v3.0
+  flink: 1.19.0,
+  branch: v3.1,
+  jdk: '8, 11, 17',
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
index 83ee3fb1..d660bd2f 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java
@@ -18,16 +18,13 @@
 
 package org.apache.flink.streaming.connectors.kafka.testutils;
 
-import org.apache.flink.api.common.JobExecutionResult;
 import org.apache.flink.api.common.restartstrategy.RestartStrategies;
 import org.apache.flink.api.common.serialization.SimpleStringSchema;
 import 
org.apache.flink.api.common.serialization.TypeInformationSerializationSchema;
 import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
-import org.apache.flink.api.dag.Transformation;
 import org.apache.flink.streaming.api.datastream.DataStream;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import 
org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction;
-import org.apache.flink.streaming.api.graph.StreamGraph;
 import org.apache.flink.streaming.api.operators.StreamSink;
 import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerBase;
 import org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment;
@@ -36,8 +33,6 @@ import 
org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartiti
 import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
 import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness;
 
-import java.util.Collections;
-import java.util.List;
 import java.util.Properties;
 import java.util.Random;
 
@@ -210,29 +205,5 @@ public class DataGenerators {
 public Throwable getError() {
 return this

(flink-connector-kafka) branch dependabot/maven/org.apache.commons-commons-compress-1.26.0 updated (d4d46ad4 -> 57a63919)

2024-04-04 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch 
dependabot/maven/org.apache.commons-commons-compress-1.26.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


 discard d4d46ad4 Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0
 add 84823602 [FLINK-35008] Bump org.apache.commons:commons-compress from 
1.25.0 to 1.26.0
 add 57a63919 [FLINK-35008] Address dependency convergence due to bumped 
commons-compress

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (d4d46ad4)
\
 N -- N -- N   
refs/heads/dependabot/maven/org.apache.commons-commons-compress-1.26.0 
(57a63919)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 pom.xml | 7 +++
 1 file changed, 7 insertions(+)



(flink-connector-kudu) branch main created (now d837674)

2024-03-26 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git


  at d837674  Initial commit

This branch includes the following new commits:

 new d837674  Initial commit

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




(flink-connector-kudu) 01/01: Initial commit

2024-03-26 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kudu.git

commit d837674efce30153c021a394adc043dce03b9ff4
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue Mar 26 09:58:39 2024 +0100

Initial commit
---
 .asf.yaml |  21 +
 .github/boring-cyborg.yml |  87 ++
 .github/workflows/push_pr.yml |  45 ++
 .github/workflows/weekly.yml  |  59 +
 .gitignore|  54 
 LICENSE   | 201 ++
 NOTICE|  16 
 README.md |  68 ++
 8 files changed, 551 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
new file mode 100644
index 000..9cf67c2
--- /dev/null
+++ b/.asf.yaml
@@ -0,0 +1,21 @@
+github:
+  enabled_merge_buttons:
+squash: true
+merge: false
+rebase: true
+  labels:
+- flink
+- kudu
+- connector
+- datastream
+- table
+- sql
+  autolink_jira: FLINK
+  collaborators:
+- flinkbot
+notifications:
+  commits:  commits@flink.apache.org
+  issues:   iss...@flink.apache.org
+  pullrequests: iss...@flink.apache.org
+  jobs: bui...@flink.apache.org
+  jira_options: link label
\ No newline at end of file
diff --git a/.github/boring-cyborg.yml b/.github/boring-cyborg.yml
new file mode 100644
index 000..321785b
--- /dev/null
+++ b/.github/boring-cyborg.yml 
@@ -0,0 +1,87 @@
+
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+labelPRBasedOnFilePath:
+  component=BuildSystem:
+- .github/**/*
+- tools/maven/*
+
+  component=Documentation:
+- docs/**/*
+
+  component=Connectors/Kudu:
+- flink-connector-kudu*/**/*
+- flink-sql-connector-kudu*/**/*
+
+## IssueLink Adder 
#
+# Insert Issue (Jira/Github etc) link in PR description based on the Issue ID 
in PR title.
+insertIssueLinkInPrDescription:
+  # specify the placeholder for the issue link that should be present in the 
description
+  descriptionIssuePlaceholderRegexp: "^Issue link: (.*)$"
+  matchers:
+# you can have several matches - for different types of issues
+# only the first matching entry is replaced
+jiraIssueMatch:
+  # specify the regexp of issue id that you can find in the title of the PR
+  # the match groups can be used to build the issue id (${1}, ${2}, etc.).
+  titleIssueIdRegexp: \[(FLINK-[0-9]+)\]
+  # the issue link to be added. ${1}, ${2} ... are replaced with the match 
groups from the
+  # title match (remember to use quotes)
+  descriptionIssueLink: 
"[${1}](https://issues.apache.org/jira/browse/${1}/)"
+docOnlyIssueMatch:
+  titleIssueIdRegexp: \[hotfix\]
+  descriptionIssueLink: "`Documentation only change, no JIRA issue`"
+
+## Title Validator 
#
+# Verifies if commit/PR titles match the regexp specified
+verifyTitles:
+  # Regular expression that should be matched by titles of commits or PR
+  titleRegexp: ^\[FLINK-[0-9]+\].*$|^\[FLINK-X\].*$|^\[hotfix].*$
+  # If set to true, it will always check the PR title (as opposed to the 
individual commits).
+  alwaysUsePrTitle: false
+  # If set to true, it will only check the commit in case there is a single 
commit.
+  # In case of multiple commits it will check PR title.
+  # This reflects the standard behaviour of Github that for `Squash & Merge` 
GitHub
+  # uses the PR title rather than commit messages for the squashed commit 
¯\_(ツ)_/¯
+  # For single-commit PRs it takes the squashed commit message from the commit 
as expected.
+  #
+  # If set to false it will check all commit messa
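
As a side note, the matching that boring-cyborg performs with the settings above can be
mirrored in plain Java. The PR title below is a made-up example; the two patterns are
copied verbatim from titleIssueIdRegexp and verifyTitles.titleRegexp, and the link
template mirrors descriptionIssueLink:

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class IssueLinkDemo {
        public static void main(String[] args) {
            String prTitle = "[FLINK-12345] Fix something";  // hypothetical PR title

            // Same pattern as titleIssueIdRegexp; group 1 captures the issue id.
            Matcher m = Pattern.compile("\\[(FLINK-[0-9]+)\\]").matcher(prTitle);
            if (m.find()) {
                // Mirrors descriptionIssueLink: ${1} is the captured group.
                String link = "[" + m.group(1)
                        + "](https://issues.apache.org/jira/browse/" + m.group(1) + "/)";
                System.out.println(link);
            }

            // Same alternation as verifyTitles.titleRegexp: FLINK issue, FLINK-X, or hotfix.
            Pattern title =
                    Pattern.compile("^\\[FLINK-[0-9]+\\].*$|^\\[FLINK-X\\].*$|^\\[hotfix].*$");
            System.out.println(title.matcher(prTitle).matches());  // true
        }
    }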

(flink-connector-hbase) branch main updated: [FLINK-34621] Bump com.google.guava:guava in flink-connector-hbase-base. This closes #43

2024-03-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


The following commit(s) were added to refs/heads/main by this push:
 new 3e7bb17  [FLINK-34621] Bump com.google.guava:guava in 
flink-connector-hbase-base. This closes #43
3e7bb17 is described below

commit 3e7bb17e2489f4ddbb93061d72f726b33e24f22c
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
AuthorDate: Thu Mar 7 12:20:44 2024 -0800

[FLINK-34621] Bump com.google.guava:guava in flink-connector-hbase-base. 
This closes #43

Bumps [com.google.guava:guava](https://github.com/google/guava) from 
31.1-jre to 32.0.0-jre.
- [Release notes](https://github.com/google/guava/releases)
- [Commits](https://github.com/google/guava/commits)

---
updated-dependencies:
- dependency-name: com.google.guava:guava
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] 
<49699333+dependabot[bot]@users.noreply.github.com>
---
 flink-connector-hbase-base/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-connector-hbase-base/pom.xml 
b/flink-connector-hbase-base/pom.xml
index e1c55ec..f50d309 100644
--- a/flink-connector-hbase-base/pom.xml
+++ b/flink-connector-hbase-base/pom.xml
@@ -166,7 +166,7 @@ under the License.

 <dependency>
     <groupId>com.google.guava</groupId>
     <artifactId>guava</artifactId>
-    <version>31.1-jre</version>
+    <version>32.0.0-jre</version>
     <scope>test</scope>
 </dependency>

 



(flink-connector-hbase) branch dependabot/maven/org.apache.zookeeper-zookeeper-3.7.2 updated (248ba58 -> fa5dd68)

2024-03-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch 
dependabot/maven/org.apache.zookeeper-zookeeper-3.7.2
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


from 248ba58  Bump org.apache.zookeeper:zookeeper from 3.4.14 to 3.7.2
 add fa5dd68  Exclude org.apache.yetus:audience-annotations

No new revisions were added by this update.

Summary of changes:
 pom.xml | 6 ++
 1 file changed, 6 insertions(+)



(flink-connector-hbase) branch main updated (9cbc109 -> dfe7646)

2024-03-06 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


from 9cbc109  [FLINK-34413] Remove HBase 1.x connector files and deps. This 
closes #42
 add dfe7646  [FLINK-34575] Bump org.apache.commons:commons-compress from 
1.23.0 to 1.26.0. This closes #41

No new revisions were added by this update.

Summary of changes:
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



(flink-connector-hbase) branch main updated (91d166d -> 9cbc109)

2024-03-06 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


from 91d166d  [hotfix] Update copyright year to 2024
 add 9cbc109  [FLINK-34413] Remove HBase 1.x connector files and deps. This 
closes #42

No new revisions were added by this update.

Summary of changes:
 docs/content.zh/docs/connectors/table/hbase.md |3 +-
 docs/content/docs/connectors/table/hbase.md|3 +-
 docs/data/hbase.yml|2 -
 .../3ba92d3a-6609-4295-92ed-f2fe207ee2b3   |0
 .../ffbddcc3-857a-4af7-a6b5-fcf71e2cc191   |0
 .../archunit-violations/stored.rules   |4 -
 flink-connector-hbase-1.4/pom.xml  |  443 --
 .../hbase1/HBase1DynamicTableFactory.java  |  186 ---
 .../hbase1/sink/HBaseDynamicTableSink.java |  143 --
 .../hbase1/source/AbstractTableInputFormat.java|  313 
 .../hbase1/source/HBaseDynamicTableSource.java |   78 -
 .../hbase1/source/HBaseRowDataInputFormat.java |   96 --
 .../org.apache.flink.table.factories.Factory   |   16 -
 .../architecture/TestCodeArchitectureTest.java |   40 -
 .../connector/hbase1/HBaseConnectorITCase.java |  758 --
 .../hbase1/HBaseDynamicTableFactoryTest.java   |  348 -
 .../flink/connector/hbase1/HBaseTablePlanTest.java |  138 --
 .../flink/connector/hbase1/util/HBaseTestBase.java |  307 
 .../util/HBaseTestingClusterAutoStarter.java   |  193 ---
 .../java/org/slf4j/impl/Log4jLoggerAdapter.java|   22 -
 .../src/test/resources/archunit.properties |   31 -
 .../src/test/resources/hbase-site.xml  |   29 -
 .../src/test/resources/log4j2-test.properties  |   28 -
 .../flink/connector/hbase1/HBaseTablePlanTest.xml  |   36 -
 flink-connector-hbase-base/pom.xml |   10 +-
 flink-connector-hbase-e2e-tests/pom.xml|   15 -
 .../apache/flink/streaming/tests/HBaseITCase.java  |   24 +-
 flink-sql-connector-hbase-1.4/pom.xml  |  157 --
 .../src/main/resources/META-INF/NOTICE |   63 -
 .../resources/META-INF/licenses/LICENSE.protobuf   |   32 -
 .../src/main/resources/hbase-default.xml   | 1558 
 pom.xml|   25 +-
 32 files changed, 40 insertions(+), 5061 deletions(-)
 delete mode 100644 
flink-connector-hbase-1.4/archunit-violations/3ba92d3a-6609-4295-92ed-f2fe207ee2b3
 delete mode 100644 
flink-connector-hbase-1.4/archunit-violations/ffbddcc3-857a-4af7-a6b5-fcf71e2cc191
 delete mode 100644 flink-connector-hbase-1.4/archunit-violations/stored.rules
 delete mode 100644 flink-connector-hbase-1.4/pom.xml
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/HBase1DynamicTableFactory.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/sink/HBaseDynamicTableSink.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/AbstractTableInputFormat.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseDynamicTableSource.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseRowDataInputFormat.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/architecture/TestCodeArchitectureTest.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseConnectorITCase.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseDynamicTableFactoryTest.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseTablePlanTest.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/util/HBaseTestBase.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/util/HBaseTestingClusterAutoStarter.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/slf4j/impl/Log4jLoggerAdapter.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/resources/archunit.properties
 delete mode 100644 flink-connector-hbase-1.4/src/test/resources/hbase-site.xml
 delete mode 100644 
flink-connector-hbase-1.4/src/test/resources/log4j2-test.properties
 delete mode 100644 
flink-connector-hbase-1.4/src/test/resources/org/apache/flink/connector/hbase1/HBaseTablePlanTest.xml
 delete mode 100644 flink-sql-connector-hbase-1.4/pom.xml
 delete mode 100644 
flink-sql-connector-hbase-1.4/src/main/resources/META-INF/NOTICE
 delete mode 100644 
flink-sql-connector-hbase-1.4

(flink-connector-jdbc) branch main updated: [FLINK-32714] Add dialect for OceanBase database. This closes #72

2024-03-01 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-jdbc.git


The following commit(s) were added to refs/heads/main by this push:
 new 786bd159 [FLINK-32714] Add dialect for OceanBase database. This closes 
#72
786bd159 is described below

commit 786bd15951887a98f3a4962ba95e4e99b7214d7a
Author: He Wang 
AuthorDate: Fri Mar 1 16:21:17 2024 +0800

[FLINK-32714] Add dialect for OceanBase database. This closes #72
---
 docs/content.zh/docs/connectors/table/jdbc.md  |  83 +++--
 docs/content/docs/connectors/table/jdbc.md |  70 ++-
 flink-connector-jdbc/pom.xml   |   8 ++
 .../flink/connector/jdbc/catalog/JdbcCatalog.java  |  17 ++-
 .../connector/jdbc/catalog/JdbcCatalogUtils.java   |   5 +-
 .../jdbc/catalog/factory/JdbcCatalogFactory.java   |   5 +-
 .../catalog/factory/JdbcCatalogFactoryOptions.java |   3 +
 .../oceanbase/dialect/OceanBaseDialect.java| 121 +++
 .../oceanbase/dialect/OceanBaseDialectFactory.java |  46 +++
 .../oceanbase/dialect/OceanBaseRowConverter.java   | 134 +
 .../connector/jdbc/dialect/JdbcDialectFactory.java |  15 +++
 .../connector/jdbc/dialect/JdbcDialectLoader.java  |   9 +-
 .../options/InternalJdbcConnectionOptions.java |   9 +-
 .../connector/jdbc/table/JdbcConnectorOptions.java |   6 +
 .../jdbc/table/JdbcDynamicTableFactory.java|  24 +++-
 ...flink.connector.jdbc.dialect.JdbcDialectFactory |   1 +
 .../oceanbase/OceanBaseMysqlTestBase.java  |  42 +++
 .../oceanbase/OceanBaseOracleTestBase.java |  44 +++
 .../oceanbase/dialect/OceanBaseDialectTest.java|  52 
 .../dialect/OceanBaseMysqlDialectTypeTest.java |  77 
 .../dialect/OceanBaseOracleDialectTypeTest.java|  71 +++
 .../OceanBaseMySqlDynamicTableSinkITCase.java  |  90 ++
 .../OceanBaseMySqlDynamicTableSourceITCase.java|  74 
 .../OceanBaseOracleDynamicTableSinkITCase.java | 121 +++
 .../OceanBaseOracleDynamicTableSourceITCase.java   |  91 ++
 .../oceanbase/table/OceanBaseTableRow.java |  48 
 .../catalog/factory/JdbcCatalogFactoryTest.java|   3 +-
 .../jdbc/dialect/JdbcDialectTypeTest.java  |   4 +-
 .../jdbc/table/JdbcDynamicTableSinkITCase.java |   9 +-
 .../connector/jdbc/table/JdbcOutputFormatTest.java |  35 ++
 .../databases/oceanbase/OceanBaseContainer.java|  74 
 .../databases/oceanbase/OceanBaseDatabase.java |  72 +++
 .../databases/oceanbase/OceanBaseImages.java   |  27 +
 .../databases/oceanbase/OceanBaseMetadata.java |  85 +
 .../databases/oceanbase/OceanBaseTestDatabase.java |  25 
 35 files changed, 1567 insertions(+), 33 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/jdbc.md 
b/docs/content.zh/docs/connectors/table/jdbc.md
index 20447e16..cf4e6384 100644
--- a/docs/content.zh/docs/connectors/table/jdbc.md
+++ b/docs/content.zh/docs/connectors/table/jdbc.md
@@ -58,7 +58,7 @@ The JDBC connector is not part of the binary distribution; see [here]({{< ref "
 | CrateDB| `io.crate` | `crate-jdbc`   | 
[Download](https://repo1.maven.org/maven2/io/crate/crate-jdbc/)   
|
 | Db2| `com.ibm.db2.jcc`  | `db2jcc`   | 
[Download](https://www.ibm.com/support/pages/download-db2-fix-packs-version-db2-linux-unix-and-windows)
  |
 | Trino  | `io.trino` | `trino-jdbc`   | 
[Download](https://repo1.maven.org/maven2/io/trino/trino-jdbc/)   
|
-
+| OceanBase  | `com.oceanbase`| `oceanbase-client` | 
[Download](https://repo1.maven.org/maven2/com/oceanbase/oceanbase-client/)
|
 
 Currently, the JDBC connector and drivers are not included in the Flink binary distribution; see [here]({{< ref 
"docs/dev/configuration/overview" >}}) to learn how to link them when executing on a cluster.
 
@@ -141,6 +141,13 @@ ON myTopic.key = MyUserTable.id;
   String
  The JDBC driver class name used to connect to this URL; if not set, it is automatically derived from the URL.
 
+
+  compatible-mode
+  optional
+  (none)
+  String
+  The compatible mode of the database.
+
 
   username
   optional
@@ -654,7 +661,7 @@ SELECT * FROM `custom_schema.test_table2`;
 
 Data Type Mapping
 
-Flink supports connecting to several databases through dialects, such as MySQL, Oracle, PostgreSQL, CrateDB, Derby, Db2, 
SQL Server, and others. Derby is typically used for testing purposes. The table below lists the type mapping from relational database 
data types to Flink SQL data types; this mapping makes it simpler to define JDBC tables in Flink.
+Flink supports connecting to several databases through dialects, such as MySQL, Oracle, PostgreSQL, CrateDB, Derby, Db2, 
SQL Server, OceanBase, and others. Derby is typically used for testing purposes. The table below lists the type mapping from relational 
database data types to Flink SQL data types; this mapping makes it simpler to define JDBC tables in Flink.
 
 
 
@@ -666,6 +673,8 @@ Flink 支持连接到多个使用方言(dialect)的数据库,如 MySQL、O
 https://docs.microsoft.com/en-us/sql/t-s
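
To make the new option concrete: a table backed by the OceanBase dialect can be declared
through the regular JDBC connector, with 'compatible-mode' chosen per deployment. This is
a hedged sketch, not taken from the commit; the URL, database, and table names are
placeholders, and the jdbc:oceanbase:// driver scheme is an assumption:

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class OceanBaseJdbcTableSketch {
        public static void main(String[] args) {
            TableEnvironment tEnv =
                    TableEnvironment.create(EnvironmentSettings.inStreamingMode());
            // 'compatible-mode' is the option added by this commit; 'mysql' is one
            // plausible value given the MySQL/Oracle test bases in the diffstat.
            tEnv.executeSql(
                    "CREATE TABLE MyUserTable (\n"
                            + "  id BIGINT,\n"
                            + "  name STRING\n"
                            + ") WITH (\n"
                            + "  'connector' = 'jdbc',\n"
                            + "  'url' = 'jdbc:oceanbase://localhost:2881/mydb',\n"
                            + "  'table-name' = 'users',\n"
                            + "  'compatible-mode' = 'mysql'\n"
                            + ")");
        }
    }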

(flink-connector-shared-utils) branch ci_utils updated: [FLINK-34314] Update CI Node Actions from NodeJS 16 to NodeJS 20. This closes (#35)

2024-02-20 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch ci_utils
in repository 
https://gitbox.apache.org/repos/asf/flink-connector-shared-utils.git


The following commit(s) were added to refs/heads/ci_utils by this push:
 new ec54606  [FLINK-34314] Update CI Node Actions from NodeJS 16 to NodeJS 
20. This closes  (#35)
ec54606 is described below

commit ec546068089dac4c4192875b57703989fc3bb009
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue Feb 20 10:12:14 2024 +0100

[FLINK-34314] Update CI Node Actions from NodeJS 16 to NodeJS 20. This 
closes  (#35)

* [FLINK-34314] Update CI Node Actions from NodeJS 16 to NodeJS 20

* [FLINK-34314] Update setup-maven also to NodeJS 20 version
---
 .github/workflows/ci.yml | 10 +-
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index e9d1fcc..418998d 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -77,19 +77,19 @@ jobs:
   - run: echo "Running CI pipeline for JDK version ${{ matrix.jdk }}"
 
   - name: Check out repository code
-uses: actions/checkout@v3
+uses: actions/checkout@v4
 with:
   ref: "${{ inputs.connector_branch }}"
 
   - name: Set JDK
-uses: actions/setup-java@v3
+uses: actions/setup-java@v4
 with:
   java-version: ${{ matrix.jdk }}
   distribution: 'temurin'
   cache: 'maven'
 
   - name: Set Maven 3.8.6
-uses: stCarolas/setup-maven@v4.5
+uses: stCarolas/setup-maven@v5
 with:
   maven-version: 3.8.6
 
@@ -130,7 +130,7 @@ jobs:
 
   - name: Restore cached Flink binary
 if: ${{ env.cache_binary == 'true' }}
-uses: actions/cache/restore@v3
+uses: actions/cache/restore@v4
 id: restore-cache-flink
 with:
   path: ${{ env.FLINK_CACHE_DIR }}
@@ -143,7 +143,7 @@ jobs:
 
   - name: Cache Flink binary
 if: ${{ env.cache_binary == 'true' }}
-uses: actions/cache/save@v3
+uses: actions/cache/save@v4
 id: cache-flink
 with:
   path: ${{ env.FLINK_CACHE_DIR }}



(flink-connector-mongodb) 02/02: [hotfix][test][connectors/mongodb] Update MongoWriterITCase to be compatible with updated SinkV2 interfaces

2024-02-19 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v1.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-mongodb.git

commit e2aa8b332552853f2e59f7f46af3cbe9f7573741
Author: Jiabao Sun 
AuthorDate: Mon Jan 29 09:50:00 2024 +0800

[hotfix][test][connectors/mongodb] Update MongoWriterITCase to be 
compatible with updated SinkV2 interfaces

This closes #22.
---
 .../mongodb/sink/writer/MongoWriterITCase.java | 33 +++---
 1 file changed, 16 insertions(+), 17 deletions(-)

diff --git 
a/flink-connector-mongodb/src/test/java/org/apache/flink/connector/mongodb/sink/writer/MongoWriterITCase.java
 
b/flink-connector-mongodb/src/test/java/org/apache/flink/connector/mongodb/sink/writer/MongoWriterITCase.java
index 84767e8..bd3ca66 100644
--- 
a/flink-connector-mongodb/src/test/java/org/apache/flink/connector/mongodb/sink/writer/MongoWriterITCase.java
+++ 
b/flink-connector-mongodb/src/test/java/org/apache/flink/connector/mongodb/sink/writer/MongoWriterITCase.java
@@ -17,8 +17,9 @@
 
 package org.apache.flink.connector.mongodb.sink.writer;
 
+import org.apache.flink.connector.base.DeliveryGuarantee;
 import org.apache.flink.connector.base.sink.writer.TestSinkInitContext;
-import org.apache.flink.connector.mongodb.common.config.MongoConnectionOptions;
+import org.apache.flink.connector.mongodb.sink.MongoSink;
 import org.apache.flink.connector.mongodb.sink.config.MongoWriteOptions;
 import org.apache.flink.connector.mongodb.sink.writer.context.MongoSinkContext;
 import 
org.apache.flink.connector.mongodb.sink.writer.serializer.MongoSerializationSchema;
@@ -49,6 +50,7 @@ import org.testcontainers.containers.MongoDBContainer;
 import org.testcontainers.junit.jupiter.Container;
 import org.testcontainers.junit.jupiter.Testcontainers;
 
+import java.io.IOException;
 import java.util.Optional;
 
 import static 
org.apache.flink.connector.mongodb.testutils.MongoTestUtil.assertThatIdsAreNotWritten;
@@ -235,7 +237,7 @@ public class MongoWriterITCase {
 MongoWriteOptions.builder()
 .setBatchSize(batchSize)
 .setBatchIntervalMs(batchIntervalMs)
-.setMaxRetries(0)
+.setDeliveryGuarantee(DeliveryGuarantee.NONE)
 .build();
 
 MongoSerializationSchema testSerializationSchema =
@@ -269,7 +271,8 @@ public class MongoWriterITCase {
 }
 
 private static MongoWriter createWriter(
-String collection, int batchSize, long batchIntervalMs, boolean 
flushOnCheckpoint) {
+String collection, int batchSize, long batchIntervalMs, boolean 
flushOnCheckpoint)
+throws IOException {
 return createWriter(
 collection,
 batchSize,
@@ -283,28 +286,24 @@ public class MongoWriterITCase {
 int batchSize,
 long batchIntervalMs,
 boolean flushOnCheckpoint,
-MongoSerializationSchema serializationSchema) {
+MongoSerializationSchema serializationSchema)
+throws IOException {
 
-MongoConnectionOptions connectionOptions =
-MongoConnectionOptions.builder()
+MongoSink mongoSink =
+MongoSink.builder()
 .setUri(MONGO_CONTAINER.getConnectionString())
 .setDatabase(TEST_DATABASE)
 .setCollection(collection)
-.build();
-
-MongoWriteOptions writeOptions =
-MongoWriteOptions.builder()
 .setBatchSize(batchSize)
 .setBatchIntervalMs(batchIntervalMs)
-.setMaxRetries(0)
+.setDeliveryGuarantee(
+flushOnCheckpoint
+? DeliveryGuarantee.AT_LEAST_ONCE
+: DeliveryGuarantee.NONE)
+.setSerializationSchema(serializationSchema)
 .build();
 
-return new MongoWriter<>(
-connectionOptions,
-writeOptions,
-flushOnCheckpoint,
-sinkInitContext,
-serializationSchema);
+return (MongoWriter) mongoSink.createWriter(sinkInitContext);
 }
 
 private static Document buildMessage(int id) {
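
The builder chain introduced by this refactoring is worth seeing outside of a diff. A
minimal sketch, using only the setters that appear above (URI, database, collection, and
the JSON-parsing serialization schema are placeholder choices for illustration):

    import org.apache.flink.connector.base.DeliveryGuarantee;
    import org.apache.flink.connector.mongodb.sink.MongoSink;

    import com.mongodb.client.model.InsertOneModel;
    import org.bson.BsonDocument;

    public class MongoSinkSketch {
        public static void main(String[] args) {
            MongoSink<String> sink =
                    MongoSink.<String>builder()
                            .setUri("mongodb://localhost:27017")
                            .setDatabase("test_database")
                            .setCollection("test_collection")
                            .setBatchSize(1000)
                            .setBatchIntervalMs(1000)
                            .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
                            .setSerializationSchema(
                                    (element, context) ->
                                            new InsertOneModel<>(BsonDocument.parse(element)))
                            .build();
            // The test then obtains a writer via sink.createWriter(sinkInitContext),
            // as the last hunk of the diff shows.
        }
    }

Compared to constructing MongoWriter directly, the builder keeps the flushOnCheckpoint
flag out of the test entirely: the delivery guarantee (AT_LEAST_ONCE vs NONE) now encodes
that behavior.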



(flink-connector-mongodb) 01/02: [FLINK-33899][connectors/mongodb] Support Java 17 and Java 21 for mongodb connector

2024-02-19 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v1.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-mongodb.git

commit aa7a9094e527fb83524fe681fcf089ffa0a3c794
Author: Jiabao Sun 
AuthorDate: Mon Jan 29 17:18:32 2024 +0800

[FLINK-33899][connectors/mongodb] Support Java 17 and Java 21 for mongodb 
connector

This closes #21.
---
 .github/workflows/push_pr.yml   |  9 -
 .github/workflows/weekly.yml| 13 +++--
 flink-connector-mongodb/pom.xml |  9 +
 pom.xml |  3 +++
 4 files changed, 27 insertions(+), 7 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index fd4e0af..9d349c0 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -25,7 +25,14 @@ jobs:
   compile_and_test:
 strategy:
   matrix:
-flink: [1.16-SNAPSHOT, 1.17-SNAPSHOT, 1.18-SNAPSHOT, 1.19-SNAPSHOT]
+flink: [ 1.16-SNAPSHOT, 1.17-SNAPSHOT ]
+jdk: [ '8, 11' ]
+include:
+  - flink: 1.18-SNAPSHOT
+jdk: '8, 11, 17'
+  - flink: 1.19-SNAPSHOT
+jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
+  jdk_version: ${{ matrix.jdk }}
diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index e2db295..a70c3b8 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -34,25 +34,26 @@ jobs:
   branch: main
 }, {
   flink: 1.18-SNAPSHOT,
+  jdk: '8, 11, 17',
   branch: main
 }, {
   flink: 1.19-SNAPSHOT,
+  jdk: '8, 11, 17, 21',
   branch: main
 }, {
-  flink: 1.16.2,
+  flink: 1.16.3,
   branch: v1.0
 }, {
-  flink: 1.17.1,
-  branch: v1.0
-},{
-  flink: 1.18.0,
+  flink: 1.17.2,
   branch: v1.0
 }, {
-  flink: 1.19-SNAPSHOT,
+  flink: 1.18.1,
+  jdk: '8, 11, 17',
   branch: v1.0
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink_branches.flink }}
   connector_branch: ${{ matrix.flink_branches.branch }}
+  jdk_version: ${{ matrix.flink_branches.jdk || '8, 11' }}
   run_dependency_convergence: false
diff --git a/flink-connector-mongodb/pom.xml b/flink-connector-mongodb/pom.xml
index 06d7d1e..1df7516 100644
--- a/flink-connector-mongodb/pom.xml
+++ b/flink-connector-mongodb/pom.xml
@@ -32,6 +32,15 @@ under the License.
 
jar
 
+   
+   
+   
+   --add-opens=java.base/java.util=ALL-UNNAMED
+   
+   --add-opens=java.base/java.lang=ALL-UNNAMED
+   
+   
+


 
diff --git a/pom.xml b/pom.xml
index ba474ec..f9f05a8 100644
--- a/pom.xml
+++ b/pom.xml
@@ -67,6 +67,9 @@ under the License.
2.17.2
 

flink-connector-mongodb-parent
+   
+   -XX:+UseG1GC -Xms256m 
-XX:+IgnoreUnrecognizedVMOptions 
${surefire.module.config}
+   
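
The reason these --add-opens flags are needed on Java 17 and 21: the JDK now enforces
strong module encapsulation, so reflective access into java.base packages fails unless
they are explicitly opened. A small self-contained probe (the field name is real, but the
class is used here only for illustration):

    import java.lang.reflect.Field;

    public class AddOpensProbe {
        public static void main(String[] args) throws Exception {
            // On JDK 17/21 this throws InaccessibleObjectException unless the JVM
            // was started with --add-opens=java.base/java.util=ALL-UNNAMED,
            // which is what the surefire argLine above supplies for the tests.
            Field data = java.util.ArrayList.class.getDeclaredField("elementData");
            data.setAccessible(true);
            System.out.println("java.base/java.util is open to the unnamed module");
        }
    }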

 




(flink-connector-mongodb) branch v1.0 updated (78fe3ed -> e2aa8b3)

2024-02-19 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch v1.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-mongodb.git


from 78fe3ed  [FLINK-33567][Documentation] Add Flink compatibility matrix 
for MongoDB 1.0.1
 new aa7a909  [FLINK-33899][connectors/mongodb] Support Java 17 and Java 21 
for mongodb connector
 new e2aa8b3  [hotfix][test][connectors/mongodb] Update MongoWriterITCase 
to be compatible with updated SinkV2 interfaces

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/push_pr.yml  |  9 +-
 .github/workflows/weekly.yml   | 13 +
 flink-connector-mongodb/pom.xml|  9 ++
 .../mongodb/sink/writer/MongoWriterITCase.java | 33 +++---
 pom.xml|  3 ++
 5 files changed, 43 insertions(+), 24 deletions(-)



(flink) branch master updated: [FLINK-34432][Table/Planner] Re-enable fork reuse

2024-02-14 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 403694e7b9c [FLINK-34432][Table/Planner] Re-enable fork reuse
403694e7b9c is described below

commit 403694e7b9c213386f3ed9cff21ce2664030ebc2
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Mon Feb 12 16:18:20 2024 +0100

[FLINK-34432][Table/Planner] Re-enable fork reuse
---
 flink-table/flink-table-planner/pom.xml | 7 ++-
 1 file changed, 2 insertions(+), 5 deletions(-)

diff --git a/flink-table/flink-table-planner/pom.xml 
b/flink-table/flink-table-planner/pom.xml
index c3d0a573c78..8104c865a88 100644
--- a/flink-table/flink-table-planner/pom.xml
+++ b/flink-table/flink-table-planner/pom.xml
@@ -393,11 +393,8 @@ under the License.


**/*ITCase.*

-   
-   
false
+   
+   
true






(flink-connector-kafka) branch dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0 updated: [FLINK-XXXXX] Bump org.yaml:snakeyaml from 2.0 to 2.2

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch 
dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to 
refs/heads/dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0 by 
this push:
 new a547a0b7 [FLINK-XXXXX] Bump org.yaml:snakeyaml from 2.0 to 2.2
a547a0b7 is described below

commit a547a0b79a61bb033ffd5f5a2661c7b48bc2da3f
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue Feb 13 15:19:54 2024 +0100

[FLINK-XXXXX] Bump org.yaml:snakeyaml from 2.0 to 2.2
---
 flink-connector-kafka/pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 57b505ad..2fa9d9ca 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -174,7 +174,7 @@ under the License.
 
 <dependency>
     <groupId>org.yaml</groupId>
     <artifactId>snakeyaml</artifactId>
-    <version>2.0</version>
+    <version>2.2</version>
     <scope>test</scope>
 </dependency>
 
 



(flink-connector-kafka) branch dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0 created (now 725727e2)

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch 
dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


  at 725727e2 Bump org.yaml:snakeyaml from 1.31 to 2.0 in 
/flink-connector-kafka

No new revisions were added by this update.



(flink-connector-elasticsearch) branch dependabot/maven/org.yaml-snakeyaml-2.0 updated (60c726d -> f769584)

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch dependabot/maven/org.yaml-snakeyaml-2.0
in repository 
https://gitbox.apache.org/repos/asf/flink-connector-elasticsearch.git


from 60c726d  Bump org.yaml:snakeyaml from 1.31 to 2.0
 add f769584  [FLINK-34435] Bump org.yaml:snakeyaml from 2.0 to 2.2

No new revisions were added by this update.

Summary of changes:
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



(flink-connector-kafka) branch main updated: [hotfix] Add JDK21 for 1.19-SNAPSHOT

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 15f2662e [hotfix] Add JDK21 for 1.19-SNAPSHOT
15f2662e is described below

commit 15f2662eccf461d9d539ed87a78c9851cd17fa43
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue Feb 13 09:50:22 2024 +0100

[hotfix] Add JDK21 for 1.19-SNAPSHOT
---
 .github/workflows/weekly.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index 0aa3bec2..aaa729fd 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -35,7 +35,7 @@ jobs:
   branch: main
 }, {
   flink: 1.19-SNAPSHOT,
-  jdk: '8, 11, 17',
+  jdk: '8, 11, 17, 21',
   branch: main
 }, {
   flink: 1.17.2,



(flink-connector-kafka) branch main updated: [hotfix] Test against Flink 1.19-SNAPSHOT for `main` and Weekly builds

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 489dd7be [hotfix] Test against Flink 1.19-SNAPSHOT for `main` and 
Weekly builds
489dd7be is described below

commit 489dd7bebdd89ef69a699b87bb3fada04a04b87f
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Tue Feb 13 09:48:29 2024 +0100

[hotfix] Test against Flink 1.19-SNAPSHOT for `main` and Weekly builds
---
 .github/workflows/push_pr.yml | 2 +-
 .github/workflows/weekly.yml  | 4 
 2 files changed, 5 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index 00e2f788..20a0 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -39,7 +39,7 @@ jobs:
   python_test:
 strategy:
   matrix:
-flink: [ 1.17.2, 1.18.1 ]
+flink: [ 1.17.2, 1.18.1, 1.19-SNAPSHOT ]
 uses: 
apache/flink-connector-shared-utils/.github/workflows/python_ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
\ No newline at end of file
diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index 21462eb7..0aa3bec2 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -33,6 +33,10 @@ jobs:
   flink: 1.18-SNAPSHOT,
   jdk: '8, 11, 17',
   branch: main
+}, {
+  flink: 1.19-SNAPSHOT,
+  jdk: '8, 11, 17',
+  branch: main
 }, {
   flink: 1.17.2,
   branch: v3.1



(flink-connector-kafka) 01/02: [FLINK-34192] Update to be compatible with updated SinkV2 interfaces

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit 186e72a2a71cda9ca1bd9ae45420b64611c10900
Author: Jiabao Sun 
AuthorDate: Thu Feb 8 23:16:44 2024 +0800

[FLINK-34192] Update to be compatible with updated SinkV2 interfaces

(cherry picked from commit b8328ab55e2bcf026ef82e35cebbb1d867cfb18f)
---
 .github/workflows/push_pr.yml  |   2 +
 flink-connector-kafka/pom.xml  |   4 +
 .../connector/kafka/sink/KafkaWriterITCase.java| 149 ++---
 .../kafka/table/KafkaTableTestUtils.java   |  16 ++-
 4 files changed, 91 insertions(+), 80 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index d57c0181..00e2f788 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -30,6 +30,8 @@ jobs:
 include:
   - flink: 1.18.1
 jdk: '8, 11, 17'
+  - flink: 1.19-SNAPSHOT
+jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 40d6a9f3..6510b9c8 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -144,6 +144,10 @@ under the License.
 org.slf4j
 slf4j-api
 
+<dependency>
+    <groupId>io.dropwizard.metrics</groupId>
+    <artifactId>metrics-core</artifactId>
+</dependency>
 
 test
 
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
index 41c26633..c9eceb98 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
@@ -27,9 +27,11 @@ import org.apache.flink.metrics.Counter;
 import org.apache.flink.metrics.Gauge;
 import org.apache.flink.metrics.MetricGroup;
 import org.apache.flink.metrics.groups.OperatorIOMetricGroup;
+import org.apache.flink.metrics.groups.OperatorMetricGroup;
 import org.apache.flink.metrics.groups.SinkWriterMetricGroup;
 import org.apache.flink.metrics.testutils.MetricListener;
 import org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup;
+import org.apache.flink.runtime.metrics.groups.ProxyMetricGroup;
 import org.apache.flink.runtime.metrics.groups.UnregisteredMetricGroups;
 import org.apache.flink.util.TestLoggerExtension;
 import org.apache.flink.util.UserCodeClassLoader;
@@ -58,7 +60,6 @@ import java.io.IOException;
 import java.nio.ByteBuffer;
 import java.util.ArrayList;
 import java.util.Collection;
-import java.util.Collections;
 import java.util.Comparator;
 import java.util.List;
 import java.util.Optional;
@@ -84,7 +85,7 @@ public class KafkaWriterITCase {
 private static final Network NETWORK = Network.newNetwork();
 private static final String KAFKA_METRIC_WITH_GROUP_NAME = 
"KafkaProducer.incoming-byte-total";
 private static final SinkWriter.Context SINK_WRITER_CONTEXT = new 
DummySinkWriterContext();
-private String topic;
+private static String topic;
 
 private MetricListener metricListener;
 private TriggerTimeService timeService;
@@ -130,11 +131,8 @@ public class KafkaWriterITCase {
 
 @Test
 public void testIncreasingRecordBasedCounters() throws Exception {
-final OperatorIOMetricGroup operatorIOMetricGroup =
-
UnregisteredMetricGroups.createUnregisteredOperatorMetricGroup().getIOMetricGroup();
-final InternalSinkWriterMetricGroup metricGroup =
-InternalSinkWriterMetricGroup.mock(
-metricListener.getMetricGroup(), 
operatorIOMetricGroup);
+final SinkWriterMetricGroup metricGroup = 
createSinkWriterMetricGroup();
+
 try (final KafkaWriter writer =
 createWriterWithConfiguration(
 getKafkaClientConfiguration(), DeliveryGuarantee.NONE, 
metricGroup)) {
@@ -167,13 +165,9 @@ public class KafkaWriterITCase {
 
 @Test
 public void testCurrentSendTimeMetric() throws Exception {
-final InternalSinkWriterMetricGroup metricGroup =
-
InternalSinkWriterMetricGroup.mock(metricListener.getMetricGroup());
 try (final KafkaWriter writer =
 createWriterWithConfiguration(
-getKafkaClientConfiguration(),
-DeliveryGuarantee.AT_LEAST_ONCE,
-metricGroup)) {
+getKafkaClientConfiguration(), 
DeliveryGuarantee.AT_LEAST_ONCE)) {
 fina

(flink-connector-kafka) 02/02: [FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit 906b6c05db1a06c5684b2826ae79a2e346736aae
Author: Jiabao Sun 
AuthorDate: Thu Feb 8 23:17:23 2024 +0800

[FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in 
flink-connector-kafka

(cherry picked from commit 2606a8256d0e25da19ebce4f92cd426b5bf63f7c)
---
 flink-connector-kafka/pom.xml   |  7 +++
 .../kafka/testutils/YamlFileMetadataService.java| 17 -
 2 files changed, 15 insertions(+), 9 deletions(-)

diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 6510b9c8..529d9252 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -171,6 +171,13 @@ under the License.
 test
 
 
+<dependency>
+    <groupId>org.yaml</groupId>
+    <artifactId>snakeyaml</artifactId>
+    <version>1.31</version>
+    <scope>test</scope>
+</dependency>
+
 
 org.apache.flink
 flink-test-utils
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
index 32839f37..524f7243 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
@@ -23,20 +23,19 @@ import 
org.apache.flink.connector.kafka.dynamic.metadata.ClusterMetadata;
 import org.apache.flink.connector.kafka.dynamic.metadata.KafkaMetadataService;
 import org.apache.flink.connector.kafka.dynamic.metadata.KafkaStream;
 
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.DumperOptions;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.TypeDescription;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.Yaml;
-import 
org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.constructor.Constructor;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.Node;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.SequenceNode;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.Tag;
-import 
org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.representer.Representer;
-
 import com.google.common.base.MoreObjects;
 import com.google.common.collect.ImmutableMap;
 import org.apache.kafka.clients.CommonClientConfigs;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+import org.yaml.snakeyaml.DumperOptions;
+import org.yaml.snakeyaml.TypeDescription;
+import org.yaml.snakeyaml.Yaml;
+import org.yaml.snakeyaml.constructor.Constructor;
+import org.yaml.snakeyaml.nodes.Node;
+import org.yaml.snakeyaml.nodes.SequenceNode;
+import org.yaml.snakeyaml.nodes.Tag;
+import org.yaml.snakeyaml.representer.Representer;
 
 import java.io.File;
 import java.io.FileWriter;
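
After this change the test constructs SnakeYAML objects from the plain org.yaml.snakeyaml
packages. A minimal sketch of that usage against the 1.31 test dependency declared above
(the bean is a hypothetical stand-in for the connector's metadata classes):

    import org.yaml.snakeyaml.Yaml;
    import org.yaml.snakeyaml.constructor.Constructor;

    public class PlainSnakeYamlSketch {
        /** Hypothetical bean standing in for a metadata POJO. */
        public static class Stream {
            public String name;
        }

        public static void main(String[] args) {
            // Direct org.yaml.snakeyaml classes replace the flink-shaded imports.
            Yaml yaml = new Yaml(new Constructor(Stream.class));
            Stream s = yaml.load("name: demo-stream");
            System.out.println(s.name);  // demo-stream
        }
    }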



(flink-connector-kafka) branch v3.1 updated (927b2e71 -> 906b6c05)

2024-02-13 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


from 927b2e71 [FLINK-34244] Update Confluent Platform to 7.4.4. This closes 
#81
 new 186e72a2 [FLINK-34192] Update to be compatible with updated SinkV2 
interfaces
 new 906b6c05 [FLINK-34193] Remove usage of Flink-Shaded Jackson and 
Snakeyaml in flink-connector-kafka

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/push_pr.yml  |   2 +
 flink-connector-kafka/pom.xml  |  11 ++
 .../connector/kafka/sink/KafkaWriterITCase.java| 149 ++---
 .../kafka/testutils/YamlFileMetadataService.java   |  17 ++-
 .../kafka/table/KafkaTableTestUtils.java   |  16 ++-
 5 files changed, 106 insertions(+), 89 deletions(-)



(flink-connector-kafka) branch main updated (cfb275b4 -> 2606a825)

2024-02-12 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


from cfb275b4 [FLINK-34244] Update Confluent Platform to 7.4.4. This closes 
#81
 new b8328ab5 [FLINK-34192] Update to be compatible with updated SinkV2 
interfaces
 new 2606a825 [FLINK-34193] Remove usage of Flink-Shaded Jackson and 
Snakeyaml in flink-connector-kafka

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/push_pr.yml  |   2 +
 flink-connector-kafka/pom.xml  |  11 ++
 .../connector/kafka/sink/KafkaWriterITCase.java| 149 ++---
 .../kafka/testutils/YamlFileMetadataService.java   |  17 ++-
 .../kafka/table/KafkaTableTestUtils.java   |  16 ++-
 5 files changed, 106 insertions(+), 89 deletions(-)



(flink-connector-kafka) 02/02: [FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka

2024-02-12 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit 2606a8256d0e25da19ebce4f92cd426b5bf63f7c
Author: Jiabao Sun 
AuthorDate: Thu Feb 8 23:17:23 2024 +0800

[FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in 
flink-connector-kafka
---
 flink-connector-kafka/pom.xml   |  7 +++
 .../kafka/testutils/YamlFileMetadataService.java| 17 -
 2 files changed, 15 insertions(+), 9 deletions(-)

diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 6510b9c8..529d9252 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -171,6 +171,13 @@ under the License.
 test
 
 
+<dependency>
+    <groupId>org.yaml</groupId>
+    <artifactId>snakeyaml</artifactId>
+    <version>1.31</version>
+    <scope>test</scope>
+</dependency>
+
 
 org.apache.flink
 flink-test-utils
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
index 32839f37..524f7243 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
@@ -23,20 +23,19 @@ import 
org.apache.flink.connector.kafka.dynamic.metadata.ClusterMetadata;
 import org.apache.flink.connector.kafka.dynamic.metadata.KafkaMetadataService;
 import org.apache.flink.connector.kafka.dynamic.metadata.KafkaStream;
 
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.DumperOptions;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.TypeDescription;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.Yaml;
-import 
org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.constructor.Constructor;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.Node;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.SequenceNode;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.Tag;
-import 
org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.representer.Representer;
-
 import com.google.common.base.MoreObjects;
 import com.google.common.collect.ImmutableMap;
 import org.apache.kafka.clients.CommonClientConfigs;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+import org.yaml.snakeyaml.DumperOptions;
+import org.yaml.snakeyaml.TypeDescription;
+import org.yaml.snakeyaml.Yaml;
+import org.yaml.snakeyaml.constructor.Constructor;
+import org.yaml.snakeyaml.nodes.Node;
+import org.yaml.snakeyaml.nodes.SequenceNode;
+import org.yaml.snakeyaml.nodes.Tag;
+import org.yaml.snakeyaml.representer.Representer;
 
 import java.io.File;
 import java.io.FileWriter;



(flink-connector-kafka) 01/02: [FLINK-34192] Update to be compatible with updated SinkV2 interfaces

2024-02-12 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit b8328ab55e2bcf026ef82e35cebbb1d867cfb18f
Author: Jiabao Sun 
AuthorDate: Thu Feb 8 23:16:44 2024 +0800

[FLINK-34192] Update to be compatible with updated SinkV2 interfaces
---
 .github/workflows/push_pr.yml  |   2 +
 flink-connector-kafka/pom.xml  |   4 +
 .../connector/kafka/sink/KafkaWriterITCase.java| 149 ++---
 .../kafka/table/KafkaTableTestUtils.java   |  16 ++-
 4 files changed, 91 insertions(+), 80 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index d57c0181..00e2f788 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -30,6 +30,8 @@ jobs:
 include:
   - flink: 1.18.1
 jdk: '8, 11, 17'
+  - flink: 1.19-SNAPSHOT
+jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 40d6a9f3..6510b9c8 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -144,6 +144,10 @@ under the License.
 org.slf4j
 slf4j-api
 
+<dependency>
+    <groupId>io.dropwizard.metrics</groupId>
+    <artifactId>metrics-core</artifactId>
+</dependency>
 
 test
 
diff --git 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
index 41c26633..c9eceb98 100644
--- 
a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
+++ 
b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
@@ -27,9 +27,11 @@ import org.apache.flink.metrics.Counter;
 import org.apache.flink.metrics.Gauge;
 import org.apache.flink.metrics.MetricGroup;
 import org.apache.flink.metrics.groups.OperatorIOMetricGroup;
+import org.apache.flink.metrics.groups.OperatorMetricGroup;
 import org.apache.flink.metrics.groups.SinkWriterMetricGroup;
 import org.apache.flink.metrics.testutils.MetricListener;
 import org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup;
+import org.apache.flink.runtime.metrics.groups.ProxyMetricGroup;
 import org.apache.flink.runtime.metrics.groups.UnregisteredMetricGroups;
 import org.apache.flink.util.TestLoggerExtension;
 import org.apache.flink.util.UserCodeClassLoader;
@@ -58,7 +60,6 @@ import java.io.IOException;
 import java.nio.ByteBuffer;
 import java.util.ArrayList;
 import java.util.Collection;
-import java.util.Collections;
 import java.util.Comparator;
 import java.util.List;
 import java.util.Optional;
@@ -84,7 +85,7 @@ public class KafkaWriterITCase {
 private static final Network NETWORK = Network.newNetwork();
 private static final String KAFKA_METRIC_WITH_GROUP_NAME = 
"KafkaProducer.incoming-byte-total";
 private static final SinkWriter.Context SINK_WRITER_CONTEXT = new 
DummySinkWriterContext();
-private String topic;
+private static String topic;
 
 private MetricListener metricListener;
 private TriggerTimeService timeService;
@@ -130,11 +131,8 @@ public class KafkaWriterITCase {
 
 @Test
 public void testIncreasingRecordBasedCounters() throws Exception {
-final OperatorIOMetricGroup operatorIOMetricGroup =
-
UnregisteredMetricGroups.createUnregisteredOperatorMetricGroup().getIOMetricGroup();
-final InternalSinkWriterMetricGroup metricGroup =
-InternalSinkWriterMetricGroup.mock(
-metricListener.getMetricGroup(), 
operatorIOMetricGroup);
+final SinkWriterMetricGroup metricGroup = 
createSinkWriterMetricGroup();
+
 try (final KafkaWriter writer =
 createWriterWithConfiguration(
 getKafkaClientConfiguration(), DeliveryGuarantee.NONE, 
metricGroup)) {
@@ -167,13 +165,9 @@ public class KafkaWriterITCase {
 
 @Test
 public void testCurrentSendTimeMetric() throws Exception {
-final InternalSinkWriterMetricGroup metricGroup =
-
InternalSinkWriterMetricGroup.mock(metricListener.getMetricGroup());
 try (final KafkaWriter writer =
 createWriterWithConfiguration(
-getKafkaClientConfiguration(),
-DeliveryGuarantee.AT_LEAST_ONCE,
-metricGroup)) {
+getKafkaClientConfiguration(), 
DeliveryGuarantee.AT_LEAST_ONCE)) {
 final Optional> currentSendTime =
 metricList

(flink-connector-kafka) branch v3.1 updated: [FLINK-34244] Update Confluent Platform to 7.4.4. This closes #81

2024-02-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new 927b2e71 [FLINK-34244] Update Confluent Platform to 7.4.4. This closes 
#81
927b2e71 is described below

commit 927b2e71bf6cd5e5b8db90c58b91bf145510da58
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Fri Feb 9 08:52:36 2024 +0100

[FLINK-34244] Update Confluent Platform to 7.4.4. This closes #81

* Make sure that all tests use the central DockerImageVersions

* Update Confluent Platform to 7.4.4

(cherry picked from commit cfb275b478ff97e9105c5ffaf20224f59a89ebd7)
---
 .../java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java  | 2 +-
 .../org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java | 2 +-
 .../flink/tests/util/kafka/SQLClientSchemaRegistryITCase.java | 2 +-
 .../java/org/apache/flink/tests/util/kafka/SmokeKafkaITCase.java  | 2 +-
 .../connector/kafka/sink/FlinkKafkaInternalProducerITCase.java| 2 +-
 .../org/apache/flink/connector/kafka/sink/KafkaSinkITCase.java| 4 ++--
 .../flink/connector/kafka/sink/KafkaTransactionLogITCase.java | 2 +-
 .../org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java  | 2 +-
 .../apache/flink/connector/kafka/source/KafkaSourceITCase.java| 2 +-
 .../flink/connector/kafka/testutils}/DockerImageVersions.java | 8 +---
 .../flink/connector/kafka/testutils/TwoKafkaContainers.java   | 2 --
 .../streaming/connectors/kafka/KafkaTestEnvironmentImpl.java  | 2 +-
 .../kafka/internals/metrics/KafkaMetricMutableWrapperTest.java| 2 +-
 .../streaming/connectors/kafka/table/KafkaTableTestBase.java  | 2 +-
 pom.xml   | 2 +-
 15 files changed, 19 insertions(+), 19 deletions(-)

diff --git 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
index b22e8a38..e18c035b 100644
--- 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
+++ 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
@@ -19,6 +19,7 @@
 package org.apache.flink.tests.util.kafka;
 
 import 
org.apache.flink.connector.kafka.sink.testutils.KafkaSinkExternalContextFactory;
+import org.apache.flink.connector.kafka.testutils.DockerImageVersions;
 import 
org.apache.flink.connector.testframe.container.FlinkContainerTestEnvironment;
 import 
org.apache.flink.connector.testframe.external.DefaultContainerizedExternalSystem;
 import org.apache.flink.connector.testframe.junit.annotations.TestContext;
@@ -28,7 +29,6 @@ import 
org.apache.flink.connector.testframe.junit.annotations.TestSemantics;
 import org.apache.flink.connector.testframe.testsuites.SinkTestSuiteBase;
 import org.apache.flink.streaming.api.CheckpointingMode;
 import org.apache.flink.test.resources.ResourceTestUtils;
-import org.apache.flink.util.DockerImageVersions;
 
 import org.testcontainers.containers.KafkaContainer;
 import org.testcontainers.utility.DockerImageName;
diff --git 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
index 4a036df2..1a2ac1f2 100644
--- 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
+++ 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
@@ -18,6 +18,7 @@
 
 package org.apache.flink.tests.util.kafka;
 
+import org.apache.flink.connector.kafka.testutils.DockerImageVersions;
 import 
org.apache.flink.connector.kafka.testutils.KafkaSourceExternalContextFactory;
 import 
org.apache.flink.connector.testframe.container.FlinkContainerTestEnvironment;
 import 
org.apache.flink.connector.testframe.external.DefaultContainerizedExternalSystem;
@@ -28,7 +29,6 @@ import 
org.apache.flink.connector.testframe.junit.annotations.TestSemantics;
 import org.apache.flink.connector.testframe.testsuites.SourceTestSuiteBase;
 import org.apache.flink.streaming.api.CheckpointingMode;
 import org.apache.flink.test.resources.ResourceTestUtils;
-import org.apache.flink.util.DockerImageVersions;
 
 import org.testcontainers.containers.KafkaContainer;
 import org.testcontainers.utility.DockerIma
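
The "central DockerImageVersions" mentioned in the commit message is, in essence, a
constants holder that every test imports instead of hard-coding image tags. The commit
itself is the source of truth; the sketch below only illustrates the shape, and the
constant names and values are assumptions:

    package org.apache.flink.connector.kafka.testutils;

    /** Illustrative sketch; actual constant names and values may differ. */
    public class DockerImageVersions {
        public static final String KAFKA = "confluentinc/cp-kafka:7.4.4";
        public static final String SCHEMA_REGISTRY = "confluentinc/cp-schema-registry:7.4.4";
    }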

(flink-connector-kafka) branch main updated: [FLINK-34244] Update Confluent Platform to 7.4.4. This closes #81

2024-02-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new cfb275b4 [FLINK-34244] Update Confluent Platform to 7.4.4. This closes 
#81
cfb275b4 is described below

commit cfb275b478ff97e9105c5ffaf20224f59a89ebd7
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Fri Feb 9 08:52:36 2024 +0100

[FLINK-34244] Update Confluent Platform to 7.4.4. This closes #81

* Make sure that all tests use the central DockerImageVersions

* Update Confluent Platform to 7.4.4
---
 .../java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java  | 2 +-
 .../org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java | 2 +-
 .../flink/tests/util/kafka/SQLClientSchemaRegistryITCase.java | 2 +-
 .../java/org/apache/flink/tests/util/kafka/SmokeKafkaITCase.java  | 2 +-
 .../connector/kafka/sink/FlinkKafkaInternalProducerITCase.java| 2 +-
 .../org/apache/flink/connector/kafka/sink/KafkaSinkITCase.java| 4 ++--
 .../flink/connector/kafka/sink/KafkaTransactionLogITCase.java | 2 +-
 .../org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java  | 2 +-
 .../apache/flink/connector/kafka/source/KafkaSourceITCase.java| 2 +-
 .../flink/connector/kafka/testutils}/DockerImageVersions.java | 8 +---
 .../flink/connector/kafka/testutils/TwoKafkaContainers.java   | 2 --
 .../streaming/connectors/kafka/KafkaTestEnvironmentImpl.java  | 2 +-
 .../kafka/internals/metrics/KafkaMetricMutableWrapperTest.java| 2 +-
 .../streaming/connectors/kafka/table/KafkaTableTestBase.java  | 2 +-
 pom.xml   | 2 +-
 15 files changed, 19 insertions(+), 19 deletions(-)

diff --git 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
index b22e8a38..e18c035b 100644
--- 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
+++ 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSinkE2ECase.java
@@ -19,6 +19,7 @@
 package org.apache.flink.tests.util.kafka;
 
 import 
org.apache.flink.connector.kafka.sink.testutils.KafkaSinkExternalContextFactory;
+import org.apache.flink.connector.kafka.testutils.DockerImageVersions;
 import 
org.apache.flink.connector.testframe.container.FlinkContainerTestEnvironment;
 import 
org.apache.flink.connector.testframe.external.DefaultContainerizedExternalSystem;
 import org.apache.flink.connector.testframe.junit.annotations.TestContext;
@@ -28,7 +29,6 @@ import 
org.apache.flink.connector.testframe.junit.annotations.TestSemantics;
 import org.apache.flink.connector.testframe.testsuites.SinkTestSuiteBase;
 import org.apache.flink.streaming.api.CheckpointingMode;
 import org.apache.flink.test.resources.ResourceTestUtils;
-import org.apache.flink.util.DockerImageVersions;
 
 import org.testcontainers.containers.KafkaContainer;
 import org.testcontainers.utility.DockerImageName;
diff --git 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
index 4a036df2..1a2ac1f2 100644
--- 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
+++ 
b/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-common-kafka/src/test/java/org/apache/flink/tests/util/kafka/KafkaSourceE2ECase.java
@@ -18,6 +18,7 @@
 
 package org.apache.flink.tests.util.kafka;
 
+import org.apache.flink.connector.kafka.testutils.DockerImageVersions;
 import 
org.apache.flink.connector.kafka.testutils.KafkaSourceExternalContextFactory;
 import 
org.apache.flink.connector.testframe.container.FlinkContainerTestEnvironment;
 import 
org.apache.flink.connector.testframe.external.DefaultContainerizedExternalSystem;
@@ -28,7 +29,6 @@ import 
org.apache.flink.connector.testframe.junit.annotations.TestSemantics;
 import org.apache.flink.connector.testframe.testsuites.SourceTestSuiteBase;
 import org.apache.flink.streaming.api.CheckpointingMode;
 import org.apache.flink.test.resources.ResourceTestUtils;
-import org.apache.flink.util.DockerImageVersions;
 
 import org.testcontainers.containers.KafkaContainer;
 import org.testcontainers.utility.DockerImageName;
diff --git 
a/flink-connector-kafka-e2e-tests/flink-end-to-end-tests-
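
The change above is mostly mechanical: instead of importing the shared
org.apache.flink.util.DockerImageVersions from Flink itself, each Kafka
connector test now resolves its container image from a copy of that class
under org.apache.flink.connector.kafka.testutils, so the connector
repository controls its Confluent Platform version on its own release
cadence. A minimal sketch of the pattern, assuming a plain constants
holder and the standard Testcontainers API; the class layout, the constant
value, and the demo class are illustrative, not the actual Flink sources:

import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

// Sketch of a central image-version holder, modeled on the relocated
// DockerImageVersions class; the exact constant value is an assumption
// based on the 7.4.4 upgrade in this commit.
final class DockerImageVersions {
    static final String KAFKA = "confluentinc/cp-kafka:7.4.4";

    private DockerImageVersions() {}
}

// Hypothetical demo: every ITCase/E2E case builds its container from the
// shared constant, so bumping Confluent Platform is a one-line change.
class KafkaContainerVersionDemo {
    public static void main(String[] args) {
        try (KafkaContainer kafka =
                new KafkaContainer(DockerImageName.parse(DockerImageVersions.KAFKA))) {
            kafka.start();
            System.out.println("Bootstrap servers: " + kafka.getBootstrapServers());
        }
    }
}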

(flink) branch master updated (01cdc703ee6 -> a33a0576364)

2024-02-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 01cdc703ee6 [FLINK-24239] Event time temporal join should support 
values from array, map, row, etc. as join key (#24253)
 add 6bfc53c1edf [FLINK-34244] Upgrade Confluent Platform Avro Schema 
Registry to version 7.5.3
 add a33a0576364 [hotfix] Remove no longer necessary `kafka.version` 
parameters

No new revisions were added by this update.

Summary of changes:
 flink-formats/flink-avro-confluent-registry/pom.xml  |  3 +--
 .../src/main/resources/META-INF/NOTICE   | 20 ++--
 flink-python/pom.xml |  1 -
 3 files changed, 7 insertions(+), 17 deletions(-)



(flink-web) 01/02: Add Kafka 3.1.0 connector

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 76eb0eaac051174ec6256a470e2327606bb5efc9
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Fri Jan 26 14:00:21 2024 +0100

Add Kafka 3.1.0 connector
---
 docs/data/flink_connectors.yml | 8 
 docs/data/release_archive.yml  | 5 +
 2 files changed, 9 insertions(+), 4 deletions(-)

diff --git a/docs/data/flink_connectors.yml b/docs/data/flink_connectors.yml
index 1dd134fd9..4a44865bb 100644
--- a/docs/data/flink_connectors.yml
+++ b/docs/data/flink_connectors.yml
@@ -58,10 +58,10 @@ jdbc:
   compatibility: ["1.16.x", "1.17.x"]
 
 kafka:
-  name: "Apache Flink Kafka Connector 3.0.2"
-  source_release_url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-connector-kafka-3.0.2/flink-connector-kafka-3.0.2-src.tgz;
-  source_release_asc_url: 
"https://downloads.apache.org/flink/flink-connector-kafka-3.0.2/flink-connector-kafka-3.0.2-src.tgz.asc;
-  source_release_sha512_url: 
"https://downloads.apache.org/flink/flink-connector-kafka-3.0.2/flink-connector-kafka-3.0.2-src.tgz.sha512;
+  name: "Apache Flink Kafka Connector 3.1.0"
+  source_release_url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-connector-kafka-3.1.0/flink-connector-kafka-3.1.0-src.tgz;
+  source_release_asc_url: 
"https://downloads.apache.org/flink/flink-connector-kafka-3.1.0/flink-connector-kafka-3.1.0-src.tgz.asc;
+  source_release_sha512_url: 
"https://downloads.apache.org/flink/flink-connector-kafka-3.1.0/flink-connector-kafka-3.1.0-src.tgz.sha512;
   compatibility: ["1.17.x", "1.18.x"]
 
 opensearch:
diff --git a/docs/data/release_archive.yml b/docs/data/release_archive.yml
index 788b1e411..9ea3b7f1f 100644
--- a/docs/data/release_archive.yml
+++ b/docs/data/release_archive.yml
@@ -564,6 +564,11 @@ release_archive:
   version: 1.1.0
   release_date: 2024-02-01
   filename: "opensearch"
+- name: "Flink Kafka Connector"
+  connector: "kafka"
+  version: 3.1.0
+  release_date: 2024-02-07
+  filename: "kafka"
 
   flink_shaded:
 -



(flink-web) branch asf-site updated (ec2e5c2b4 -> c3653a465)

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


from ec2e5c2b4 [FLINK-34365] Delete repeated pages in Chinese Flink website 
and correct the Paimon url (#713)
 new 76eb0eaac Add Kafka 3.1.0 connector
 new c3653a465 Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../08/26/apache-flink-0.6-available/index.html|   2 +-
 .../09/26/apache-flink-0.6.1-available/index.html  |   2 +-
 content/2014/10/03/upcoming-events/index.html  |   2 +-
 .../11/04/apache-flink-0.7.0-available/index.html  |   2 +-
 .../11/18/hadoop-compatibility-in-flink/index.html |   2 +-
 .../index.html |   2 +-
 .../01/21/apache-flink-0.8.0-available/index.html  |   2 +-
 .../january-2015-in-the-flink-community/index.html |   2 +-
 .../02/09/introducing-flink-streaming/index.html   |   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 .../march-2015-in-the-flink-community/index.html   |   2 +-
 .../index.html |   2 +-
 .../05/11/juggling-with-bits-and-bytes/index.html  |   2 +-
 .../april-2015-in-the-flink-community/index.html   |   2 +-
 .../06/24/announcing-apache-flink-0.9.0/index.html |   2 +-
 .../index.html |   2 +-
 .../09/01/apache-flink-0.9.1-available/index.html  |   2 +-
 .../09/03/announcing-flink-forward-2015/index.html |   2 +-
 .../index.html |   2 +-
 .../16/announcing-apache-flink-0.10.0/index.html   |   2 +-
 .../2015/11/27/flink-0.10.1-released/index.html|   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 .../2016/02/11/flink-0.10.2-released/index.html|   2 +-
 .../03/08/announcing-apache-flink-1.0.0/index.html |   2 +-
 content/2016/04/06/flink-1.0.1-released/index.html |   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 content/2016/04/22/flink-1.0.2-released/index.html |   2 +-
 content/2016/05/11/flink-1.0.3-released/index.html |   2 +-
 .../index.html |   2 +-
 .../08/04/announcing-apache-flink-1.1.0/index.html |   2 +-
 content/2016/08/04/flink-1.1.1-released/index.html |   2 +-
 .../index.html |   2 +-
 .../09/05/apache-flink-1.1.2-released/index.html   |   2 +-
 .../10/12/apache-flink-1.1.3-released/index.html   |   2 +-
 .../apache-flink-in-2016-year-in-review/index.html |   2 +-
 .../12/21/apache-flink-1.1.4-released/index.html   |   2 +-
 .../02/06/announcing-apache-flink-1.2.0/index.html |   2 +-
 .../03/23/apache-flink-1.1.5-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 .../04/26/apache-flink-1.2.1-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 .../06/23/apache-flink-1.3.1-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../08/05/apache-flink-1.3.2-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../index.html |   2 +-
 .../apache-flink-in-2017-year-in-review/index.html |   2 +-
 .../index.html |   2 +-
 .../02/15/apache-flink-1.4.1-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../03/08/apache-flink-1.4.2-released/index.html   |   2 +-
 .../03/15/apache-flink-1.3.3-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../07/12/apache-flink-1.5.1-released/index.html   |   2 +-
 .../07/31/apache-flink-1.5.2-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../08/21/apache-flink-1.5.3-released/index.html   |   2 +-
 .../09/20/apache-flink-1.5.4-released/index.html   |   2 +-
 .../09/20/apache-flink-1.6.1-released/index.html   |   2 +-
 .../10/29/apache-flink-1.5.5-released/index.html   |   2 +-
 .../10/29/apache-flink-1.6.2-released/index.html   |   2 +-
 .../index.html |   2 +-
 .../12/21/apache-flink-1.7.1-released/index.html   |   2 +-
 .../12/22/apache-flink-1.6.3-released/index.html   |   2 +-
 .../12/26/apache-flink-1.5.6-released/index.html   |   2 +-

(flink-connector-kafka) branch v3.1 updated: [hotfix] Update copyright year to 2024 (#82)

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new 83bd63bc [hotfix] Update copyright year to 2024 (#82)
83bd63bc is described below

commit 83bd63bcad9fd7cf797ca5246998720cdbf88a1d
Author: Hang Ruan 
AuthorDate: Tue Jan 30 09:53:58 2024 +0800

[hotfix] Update copyright year to 2024 (#82)

(cherry picked from commit abf4563e0342abe25dc28bb6b5457bb971381f61)
---
 flink-sql-connector-kafka/src/main/resources/META-INF/NOTICE | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-sql-connector-kafka/src/main/resources/META-INF/NOTICE 
b/flink-sql-connector-kafka/src/main/resources/META-INF/NOTICE
index 1ca013b7..926976da 100644
--- a/flink-sql-connector-kafka/src/main/resources/META-INF/NOTICE
+++ b/flink-sql-connector-kafka/src/main/resources/META-INF/NOTICE
@@ -1,5 +1,5 @@
 flink-sql-connector-kafka
-Copyright 2014-2023 The Apache Software Foundation
+Copyright 2014-2024 The Apache Software Foundation
 
 This product includes software developed at
 The Apache Software Foundation (http://www.apache.org/).



(flink-connector-kafka) branch v3.1 updated: [hotfix] Update shortcode in Kafka Datastream documentation so that the download links appear

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new 3e4ff056 [hotfix] Update shortcode in Kafka Datastream documentation 
so that the download links appear
3e4ff056 is described below

commit 3e4ff056bd478050abc08b22aac205dcfc504714
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 11:12:13 2024 +0100

[hotfix] Update shortcode in Kafka Datastream documentation so that the 
download links appear

(cherry picked from commit 6d0ffcd7f4bfa58a8ede1404d4c082614be4c753)
---
 docs/content.zh/docs/connectors/datastream/dynamic-kafka.md | 2 +-
 docs/content.zh/docs/connectors/datastream/kafka.md | 2 +-
 docs/content/docs/connectors/datastream/dynamic-kafka.md| 2 +-
 docs/content/docs/connectors/datastream/kafka.md| 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md 
b/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md
index e46adca3..39adb936 100644
--- a/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md
+++ b/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md
@@ -36,7 +36,7 @@ makes these operations automated so that they are transparent 
to Kafka consumers
 
 For details on Kafka compatibility, please refer to the official [Kafka 
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< connector_artifact flink-connector-kafka 3.1.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 Flink's streaming connectors are not part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).
diff --git a/docs/content.zh/docs/connectors/datastream/kafka.md 
b/docs/content.zh/docs/connectors/datastream/kafka.md
index 35fd281f..9c54aec7 100644
--- a/docs/content.zh/docs/connectors/datastream/kafka.md
+++ b/docs/content.zh/docs/connectors/datastream/kafka.md
@@ -36,7 +36,7 @@ Apache Flink 集成了通用的 Kafka 连接器,它会尽力与 Kafka client 
 当前 Kafka client 向后兼容 0.10.0 或更高版本的 Kafka broker。
 有关 Kafka 兼容性的更多细节,请参考 [Kafka 
官方文档](https://kafka.apache.org/protocol.html#protocol_compatibility)。
 
-{{< connector_artifact flink-connector-kafka 3.0.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 如果使用 Kafka source,```flink-connector-base``` 也需要包含在依赖中:
 
diff --git a/docs/content/docs/connectors/datastream/dynamic-kafka.md 
b/docs/content/docs/connectors/datastream/dynamic-kafka.md
index 4c6e38fc..08fa2401 100644
--- a/docs/content/docs/connectors/datastream/dynamic-kafka.md
+++ b/docs/content/docs/connectors/datastream/dynamic-kafka.md
@@ -36,7 +36,7 @@ makes these operations automated so that they are transparent 
to Kafka consumers
 
 For details on Kafka compatibility, please refer to the official [Kafka 
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< connector_artifact flink-connector-kafka 3.1.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 Flink's streaming connectors are not part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).
diff --git a/docs/content/docs/connectors/datastream/kafka.md 
b/docs/content/docs/connectors/datastream/kafka.md
index 422ed9e3..0ab35af6 100644
--- a/docs/content/docs/connectors/datastream/kafka.md
+++ b/docs/content/docs/connectors/datastream/kafka.md
@@ -36,7 +36,7 @@ The version of the client it uses may change between Flink 
releases.
 Modern Kafka clients are backwards compatible with broker versions 0.10.0 or 
later.
 For details on Kafka compatibility, please refer to the official [Kafka 
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< connector_artifact flink-connector-kafka 3.0.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 Flink's streaming connectors are not part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).



(flink-connector-kafka) branch main updated: [hotfix] Update shortcode in Kafka Datastream documentation so that the download links appear

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new 6d0ffcd7 [hotfix] Update shortcode in Kafka Datastream documentation 
so that the download links appear
6d0ffcd7 is described below

commit 6d0ffcd7f4bfa58a8ede1404d4c082614be4c753
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 11:12:13 2024 +0100

[hotfix] Update shortcode in Kafka Datastream documentation so that the 
download links appear
---
 docs/content.zh/docs/connectors/datastream/dynamic-kafka.md | 2 +-
 docs/content.zh/docs/connectors/datastream/kafka.md | 2 +-
 docs/content/docs/connectors/datastream/dynamic-kafka.md| 2 +-
 docs/content/docs/connectors/datastream/kafka.md| 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md 
b/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md
index e46adca3..39adb936 100644
--- a/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md
+++ b/docs/content.zh/docs/connectors/datastream/dynamic-kafka.md
@@ -36,7 +36,7 @@ makes these operations automated so that they are transparent 
to Kafka consumers
 
 For details on Kafka compatibility, please refer to the official [Kafka 
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< connector_artifact flink-connector-kafka 3.1.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 Flink's streaming connectors are not part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).
diff --git a/docs/content.zh/docs/connectors/datastream/kafka.md 
b/docs/content.zh/docs/connectors/datastream/kafka.md
index 35fd281f..9c54aec7 100644
--- a/docs/content.zh/docs/connectors/datastream/kafka.md
+++ b/docs/content.zh/docs/connectors/datastream/kafka.md
@@ -36,7 +36,7 @@ Apache Flink 集成了通用的 Kafka 连接器,它会尽力与 Kafka client 
 当前 Kafka client 向后兼容 0.10.0 或更高版本的 Kafka broker。
 有关 Kafka 兼容性的更多细节,请参考 [Kafka 
官方文档](https://kafka.apache.org/protocol.html#protocol_compatibility)。
 
-{{< connector_artifact flink-connector-kafka 3.0.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 如果使用 Kafka source,```flink-connector-base``` 也需要包含在依赖中:
 
diff --git a/docs/content/docs/connectors/datastream/dynamic-kafka.md 
b/docs/content/docs/connectors/datastream/dynamic-kafka.md
index 4c6e38fc..08fa2401 100644
--- a/docs/content/docs/connectors/datastream/dynamic-kafka.md
+++ b/docs/content/docs/connectors/datastream/dynamic-kafka.md
@@ -36,7 +36,7 @@ makes these operations automated so that they are transparent 
to Kafka consumers
 
 For details on Kafka compatibility, please refer to the official [Kafka 
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< connector_artifact flink-connector-kafka 3.1.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 Flink's streaming connectors are not part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).
diff --git a/docs/content/docs/connectors/datastream/kafka.md 
b/docs/content/docs/connectors/datastream/kafka.md
index 422ed9e3..0ab35af6 100644
--- a/docs/content/docs/connectors/datastream/kafka.md
+++ b/docs/content/docs/connectors/datastream/kafka.md
@@ -36,7 +36,7 @@ The version of the client it uses may change between Flink 
releases.
 Modern Kafka clients are backwards compatible with broker versions 0.10.0 or 
later.
 For details on Kafka compatibility, please refer to the official [Kafka 
documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< connector_artifact flink-connector-kafka 3.0.0 >}}
+{{< connector_artifact flink-connector-kafka kafka >}}
 
 Flink's streaming connectors are not part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).



(flink-connector-kafka) branch v3.1 updated: [hotfix] Make the upsert-kafka artifacts point to the regular kafka artifacts in the documentation

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new b4d4eea1 [hotfix] Make the upsert-kafka artifacts point to the regular 
kafka artifacts in the documentation
b4d4eea1 is described below

commit b4d4eea170e1f4931d43fa43a640b522f5f75baf
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 11:03:02 2024 +0100

[hotfix] Make the upsert-kafka artifacts point to the regular kafka 
artifacts in the documentation

(cherry picked from commit ab356b4d0232b47ae6b7b507b3a9f1ccca862b98)
---
 docs/content.zh/docs/connectors/table/upsert-kafka.md | 2 +-
 docs/content/docs/connectors/table/upsert-kafka.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/upsert-kafka.md 
b/docs/content.zh/docs/connectors/table/upsert-kafka.md
index a7e571fd..9612de0b 100644
--- a/docs/content.zh/docs/connectors/table/upsert-kafka.md
+++ b/docs/content.zh/docs/connectors/table/upsert-kafka.md
@@ -38,7 +38,7 @@ Upsert Kafka 连接器支持以 upsert 方式从 Kafka topic 中读取数据并
 依赖
 
 
-{{< sql_download_table "upsert-kafka" >}}
+{{< sql_connector_download_table "kafka" >}}
 
 Upsert Kafka 连接器不是二进制发行版的一部分,请查阅[这里]({{< ref "docs/dev/configuration/overview" 
>}})了解如何在集群运行中引用 Upsert Kafka 连接器。
 
diff --git a/docs/content/docs/connectors/table/upsert-kafka.md 
b/docs/content/docs/connectors/table/upsert-kafka.md
index 61237640..814ff9bc 100644
--- a/docs/content/docs/connectors/table/upsert-kafka.md
+++ b/docs/content/docs/connectors/table/upsert-kafka.md
@@ -47,7 +47,7 @@ key will fall into the same partition.
 Dependencies
 
 
-{{< sql_connector_download_table "upsert-kafka" >}}
+{{< sql_connector_download_table "kafka" >}}
 
 The Upsert Kafka connector is not part of the binary distribution.
 See how to link with it for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).



(flink-connector-kafka) branch main updated: [hotfix] Make the upsert-kafka artifacts point to the regular kafka artifacts in the documentation

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/main by this push:
 new ab356b4d [hotfix] Make the upsert-kafka artifacts point to the regular 
kafka artifacts in the documentation
ab356b4d is described below

commit ab356b4d0232b47ae6b7b507b3a9f1ccca862b98
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 11:03:02 2024 +0100

[hotfix] Make the upsert-kafka artifacts point to the regular kafka 
artifacts in the documentation
---
 docs/content.zh/docs/connectors/table/upsert-kafka.md | 2 +-
 docs/content/docs/connectors/table/upsert-kafka.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/upsert-kafka.md 
b/docs/content.zh/docs/connectors/table/upsert-kafka.md
index a7e571fd..9612de0b 100644
--- a/docs/content.zh/docs/connectors/table/upsert-kafka.md
+++ b/docs/content.zh/docs/connectors/table/upsert-kafka.md
@@ -38,7 +38,7 @@ Upsert Kafka 连接器支持以 upsert 方式从 Kafka topic 中读取数据并
 依赖
 
 
-{{< sql_download_table "upsert-kafka" >}}
+{{< sql_connector_download_table "kafka" >}}
 
 Upsert Kafka 连接器不是二进制发行版的一部分,请查阅[这里]({{< ref "docs/dev/configuration/overview" 
>}})了解如何在集群运行中引用 Upsert Kafka 连接器。
 
diff --git a/docs/content/docs/connectors/table/upsert-kafka.md 
b/docs/content/docs/connectors/table/upsert-kafka.md
index 61237640..814ff9bc 100644
--- a/docs/content/docs/connectors/table/upsert-kafka.md
+++ b/docs/content/docs/connectors/table/upsert-kafka.md
@@ -47,7 +47,7 @@ key will fall into the same partition.
 Dependencies
 
 
-{{< sql_connector_download_table "upsert-kafka" >}}
+{{< sql_connector_download_table "kafka" >}}
 
 The Upsert Kafka connector is not part of the binary distribution.
 See how to link with it for cluster execution [here]({{< ref 
"docs/dev/configuration/overview" >}}).



(flink) branch master updated (f1fba33d85a -> eb1f7c8f998)

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from f1fba33d85a [FLINK-34345][runtime] Remove TaskExecutorManager related 
logic
 add eb1f7c8f998 [hotfix] Include `release-1.19` for building documentation

No new revisions were added by this update.

Summary of changes:
 .github/workflows/docs.yml | 1 +
 1 file changed, 1 insertion(+)



(flink) branch release-1.17 updated: [release] Integrate Kafka v3.1 connector documentation

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch release-1.17
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.17 by this push:
 new caa565a5550 [release] Integrate Kafka v3.1 connector documentation
caa565a5550 is described below

commit caa565a555006b219f71662cb7df287c6d63683d
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 09:11:49 2024 +0100

[release] Integrate Kafka v3.1 connector documentation
---
 docs/setup_docs.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/setup_docs.sh b/docs/setup_docs.sh
index 2c06d1e2772..2e1ea0ef858 100755
--- a/docs/setup_docs.sh
+++ b/docs/setup_docs.sh
@@ -53,6 +53,7 @@ integrate_connector_docs gcp-pubsub v3.0
 integrate_connector_docs mongodb v1.0
 integrate_connector_docs opensearch v1.1
 integrate_connector_docs hbase v3.0
+integrate_connector_docs kafka v3.1
 
 cd ..
 rm -rf tmp



(flink) branch release-1.18 updated: [release] Integrate Kafka v3.1 connector documentation

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch release-1.18
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.18 by this push:
 new bee4157729c [release] Integrate Kafka v3.1 connector documentation
bee4157729c is described below

commit bee4157729c53f29e8b0940b365bad5710b6ecb6
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 09:10:44 2024 +0100

[release] Integrate Kafka v3.1 connector documentation
---
 docs/setup_docs.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/setup_docs.sh b/docs/setup_docs.sh
index f92a2528bac..50209e93cb1 100755
--- a/docs/setup_docs.sh
+++ b/docs/setup_docs.sh
@@ -52,7 +52,7 @@ integrate_connector_docs rabbitmq v3.0
 integrate_connector_docs gcp-pubsub v3.0
 integrate_connector_docs mongodb v1.0
 integrate_connector_docs opensearch v1.1
-integrate_connector_docs kafka v3.0
+integrate_connector_docs kafka v3.1
 integrate_connector_docs hbase v3.0
 
 cd ..



(flink-connector-kafka) branch v3.1 updated: [release] Add Flink compatibility table for documentation purposes

2024-02-07 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.1
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


The following commit(s) were added to refs/heads/v3.1 by this push:
 new 1b34b55e [release] Add Flink compatibility table for documentation 
purposes
1b34b55e is described below

commit 1b34b55e85487c99e072c6a7df8ba14620e2ef02
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Wed Feb 7 09:09:53 2024 +0100

[release] Add Flink compatibility table for documentation purposes
---
 docs/data/kafka.yml | 23 +++
 1 file changed, 23 insertions(+)

diff --git a/docs/data/kafka.yml b/docs/data/kafka.yml
new file mode 100644
index ..699243d1
--- /dev/null
+++ b/docs/data/kafka.yml
@@ -0,0 +1,23 @@
+
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+version: 3.1.0
+flink_compatibility: [1.17, 1.18]
+variants:
+  - maven: flink-connector-kafka
+sql_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-kafka/$full_version/flink-sql-connector-kafka-$full_version.jar
\ No newline at end of file



(flink) branch master updated: [FLINK-34337][Core] Sink.InitContextWrapper should implement metadataConsumer method. This closes #24249

2024-02-06 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 03b45844228 [FLINK-34337][Core] Sink.InitContextWrapper should 
implement metadataConsumer method. This closes #24249
03b45844228 is described below

commit 03b4584422826d2819d571871dfef4efced19f01
Author: Jiabao Sun 
AuthorDate: Wed Feb 7 03:19:43 2024 +0800

[FLINK-34337][Core] Sink.InitContextWrapper should implement 
metadataConsumer method. This closes #24249

* Sink.InitContextWrapper should implement metadataConsumer method

* Add test for InitContextWrapper
---
 .../org/apache/flink/api/connector/sink2/Sink.java |  6 ++
 .../operators/sink/SinkWriterOperatorTestBase.java | 94 ++
 2 files changed, 100 insertions(+)

diff --git 
a/flink-core/src/main/java/org/apache/flink/api/connector/sink2/Sink.java 
b/flink-core/src/main/java/org/apache/flink/api/connector/sink2/Sink.java
index 7558fb2fa8e..49d3601c40c 100644
--- a/flink-core/src/main/java/org/apache/flink/api/connector/sink2/Sink.java
+++ b/flink-core/src/main/java/org/apache/flink/api/connector/sink2/Sink.java
@@ -218,5 +218,11 @@ public interface Sink<InputT> extends Serializable {
         public <IN> TypeSerializer<IN> createInputSerializer() {
             return wrapped.createInputSerializer();
         }
+
+        @Experimental
+        @Override
+        public <MetaT> Optional<Consumer<MetaT>> metadataConsumer() {
+            return wrapped.metadataConsumer();
+        }
     }
 }
diff --git 
a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/operators/sink/SinkWriterOperatorTestBase.java
 
b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/operators/sink/SinkWriterOperatorTestBase.java
index c4627605f71..debe699c3a8 100644
--- 
a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/operators/sink/SinkWriterOperatorTestBase.java
+++ 
b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/operators/sink/SinkWriterOperatorTestBase.java
@@ -26,6 +26,8 @@ import org.apache.flink.api.common.typeutils.TypeSerializer;
 import org.apache.flink.api.common.typeutils.base.StringSerializer;
 import 
org.apache.flink.api.common.typeutils.base.array.BytePrimitiveArraySerializer;
 import org.apache.flink.api.connector.sink2.Sink;
+import org.apache.flink.api.connector.sink2.SinkWriter;
+import org.apache.flink.api.connector.sink2.WriterInitContext;
 import org.apache.flink.api.java.tuple.Tuple3;
 import org.apache.flink.core.io.SimpleVersionedSerialization;
 import org.apache.flink.core.io.SimpleVersionedSerializer;
@@ -55,13 +57,17 @@ import org.junit.jupiter.params.provider.ValueSource;
 import javax.annotation.Nullable;
 
 import java.io.IOException;
+import java.lang.reflect.Proxy;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collection;
 import java.util.Collections;
 import java.util.List;
+import java.util.Optional;
 import java.util.Queue;
+import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.concurrent.atomic.AtomicReference;
+import java.util.function.Consumer;
 import java.util.function.LongSupplier;
 import java.util.function.Supplier;
 
@@ -425,6 +431,94 @@ abstract class SinkWriterOperatorTestBase {
 testHarness.close();
 }
 
+    @Test
+    void testInitContextWrapper() throws Exception {
+        final AtomicReference initContext = new AtomicReference<>();
+        final AtomicReference originalContext = new AtomicReference<>();
+        final AtomicBoolean consumed = new AtomicBoolean(false);
+        final Consumer<AtomicBoolean> metadataConsumer = element -> element.set(true);
+
+        final Sink sink =
+                new Sink() {
+                    @Override
+                    public SinkWriter createWriter(WriterInitContext context)
+                            throws IOException {
+                        WriterInitContext decoratedContext =
+                                (WriterInitContext)
+                                        Proxy.newProxyInstance(
+                                                WriterInitContext.class.getClassLoader(),
+                                                new Class[] {WriterInitContext.class},
+                                                (proxy, method, args) -> {
+                                                    if (method.getName()
+                                                            .equals("metadataConsumer")) {
+                                                        return Optional.of(metadataConsumer);
+                                                    }
+                                                    return method.invoke(context, args);
+                                                });
+                        origi
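
The fix above is a single overridden method, but the failure mode deserves
a note: a delegating wrapper that does not override a defaulted interface
method silently falls back to the default, dropping whatever the wrapped
context would have supplied. A stripped-down sketch of the delegation
pattern, using simplified stand-in types rather than the real Flink
interfaces:

import java.util.Optional;
import java.util.function.Consumer;

// Simplified stand-in for an init-context interface with a defaulted hook;
// the names here are illustrative, not the Flink API.
interface MetadataAwareContext {
    default <MetaT> Optional<Consumer<MetaT>> metadataConsumer() {
        return Optional.empty();
    }
}

// Without the override below, the wrapper inherits Optional.empty() and
// the wrapped context's consumer is lost: the bug FLINK-34337 fixes in
// Sink.InitContextWrapper.
class ForwardingContext implements MetadataAwareContext {
    private final MetadataAwareContext wrapped;

    ForwardingContext(MetadataAwareContext wrapped) {
        this.wrapped = wrapped;
    }

    @Override
    public <MetaT> Optional<Consumer<MetaT>> metadataConsumer() {
        return wrapped.metadataConsumer();
    }
}

The accompanying test reaches for java.lang.reflect.Proxy for the same
reason: the proxy intercepts only metadataConsumer() and forwards every
other call to the real context, so the assertion stays valid even as the
interface gains new methods.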

svn commit: r67207 - /dev/flink/flink-connector-kafka-3.1.0-rc1/ /release/flink/flink-connector-kafka-3.1.0/

2024-02-06 Thread martijnvisser
Author: martijnvisser
Date: Tue Feb  6 15:51:47 2024
New Revision: 67207

Log:
Release flink-connector-kafka 3.1.0

Added:
release/flink/flink-connector-kafka-3.1.0/
  - copied from r67206, dev/flink/flink-connector-kafka-3.1.0-rc1/
Removed:
dev/flink/flink-connector-kafka-3.1.0-rc1/


