(flink) branch release-1.18 updated: [FLINK-34344] Pass JobID to CheckpointStatsTracker

2024-02-12 Thread roman
This is an automated email from the ASF dual-hosted git repository.

roman pushed a commit to branch release-1.18
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.18 by this push:
 new 33fb37aac5f [FLINK-34344] Pass JobID to CheckpointStatsTracker
33fb37aac5f is described below

commit 33fb37aac5fbc709a62d35445879c75a6ba48086
Author: Roman Khachatryan 
AuthorDate: Fri Feb 2 16:02:14 2024 +0100

[FLINK-34344] Pass JobID to CheckpointStatsTracker
---
 .../runtime/checkpoint/CheckpointStatsTracker.java | 11 +--
 .../scheduler/DefaultExecutionGraphFactory.java|  3 ++-
 .../checkpoint/CheckpointCoordinatorFailureTest.java   |  4 +++-
 .../CheckpointCoordinatorMasterHooksTest.java  |  2 +-
 .../runtime/checkpoint/CheckpointCoordinatorTest.java  | 18 --
 .../checkpoint/CheckpointCoordinatorTestingUtils.java  |  2 +-
 .../runtime/checkpoint/CheckpointStatsTrackerTest.java | 11 ++-
 .../flink/runtime/dispatcher/DispatcherTest.java   |  4 +++-
 .../TestingDefaultExecutionGraphBuilder.java   |  3 ++-
 .../AbstractCheckpointStatsHandlerTest.java|  4 +++-
 10 files changed, 34 insertions(+), 28 deletions(-)

diff --git a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java
index f868f3fb4ba..5a8f72f0805 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java
@@ -25,7 +25,6 @@ import org.apache.flink.metrics.Metric;
 import org.apache.flink.metrics.MetricGroup;
 import org.apache.flink.runtime.executiongraph.ExecutionAttemptID;
 import org.apache.flink.runtime.jobgraph.JobVertexID;
-import org.apache.flink.runtime.metrics.groups.JobManagerJobMetricGroup;
 import org.apache.flink.runtime.rest.messages.checkpoints.CheckpointStatistics;
 import org.apache.flink.runtime.rest.util.RestMapperUtils;
 
@@ -106,17 +105,9 @@ public class CheckpointStatsTracker {
     * @param numRememberedCheckpoints Maximum number of checkpoints to remember, including in
     *     progress ones.
     * @param metricGroup Metric group for exposed metrics
+    * @param jobID ID of the job being checkpointed
     */
    public CheckpointStatsTracker(
-           int numRememberedCheckpoints, JobManagerJobMetricGroup metricGroup) {
-       this(numRememberedCheckpoints, metricGroup, metricGroup.jobId());
-   }
-
-   public CheckpointStatsTracker(int numRememberedCheckpoints, MetricGroup metricGroup) {
-       this(numRememberedCheckpoints, metricGroup, new JobID());
-   }
-
-   private CheckpointStatsTracker(
            int numRememberedCheckpoints, MetricGroup metricGroup, JobID jobID) {
        checkArgument(numRememberedCheckpoints >= 0, "Negative number of remembered checkpoints");
        this.history = new CheckpointStatsHistory(numRememberedCheckpoints);
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
index 29eb7222d95..0c5279f11fd 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
@@ -129,7 +129,8 @@ public class DefaultExecutionGraphFactory implements ExecutionGraphFactory {
                                new CheckpointStatsTracker(
                                        configuration.getInteger(
                                                WebOptions.CHECKPOINTS_HISTORY_SIZE),
-                                       jobManagerJobMetricGroup));
+                                       jobManagerJobMetricGroup,
+                                       jobManagerJobMetricGroup.jobId()));
        this.isDynamicGraph = isDynamicGraph;
        this.executionJobVertexFactory = checkNotNull(executionJobVertexFactory);
        this.nonFinishedHybridPartitionShouldBeUnknown = nonFinishedHybridPartitionShouldBeUnknown;
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java b/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java
index 2538072c516..54e06728fe8 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java
@@ -18,6 +18,7 @@
 
 package org.apache.flink.runtime.checkpoint;
 
+import org.apache.flink.api.common.JobID;
 import org.apache.flink.api.common.JobStatus;
 import org.apa

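For context, the change above removes the convenience constructors that defaulted the job ID (to `metricGroup.jobId()` or a fresh `new JobID()`) and makes callers supply it explicitly, so checkpoint stats are always attributed to the real job. Below is a minimal, self-contained sketch of that refactoring pattern; the `JobID` and tracker classes are simplified hypothetical stand-ins, not the actual Flink types:

```java
import java.util.Objects;
import java.util.UUID;

// Sketch of the FLINK-34344 pattern with hypothetical stand-in classes:
// instead of a convenience constructor that silently invents an ID,
// the caller must pass the job's real ID explicitly.
public class StatsTrackerSketch {

    /** Simplified stand-in for org.apache.flink.api.common.JobID. */
    static final class JobID {
        private final String id;

        JobID() {
            this(UUID.randomUUID().toString()); // random ID, as a defaulted `new JobID()` would be
        }

        JobID(String id) {
            this.id = Objects.requireNonNull(id);
        }

        String value() {
            return id;
        }
    }

    /** Simplified stand-in for CheckpointStatsTracker after the change. */
    static final class CheckpointStatsTracker {
        private final int numRememberedCheckpoints;
        private final JobID jobID;

        // Single public constructor: the JobID now comes from the caller
        // (in Flink, e.g. jobManagerJobMetricGroup.jobId()).
        CheckpointStatsTracker(int numRememberedCheckpoints, JobID jobID) {
            if (numRememberedCheckpoints < 0) {
                throw new IllegalArgumentException("Negative number of remembered checkpoints");
            }
            this.numRememberedCheckpoints = numRememberedCheckpoints;
            this.jobID = Objects.requireNonNull(jobID);
        }

        JobID jobID() {
            return jobID;
        }
    }

    public static void main(String[] args) {
        JobID jobId = new JobID("my-job");
        CheckpointStatsTracker tracker = new CheckpointStatsTracker(10, jobId);
        System.out.println(tracker.jobID().value()); // prints "my-job"
    }
}
```

The payoff is that stats are keyed by the job's actual ID rather than a freshly generated one, which matters when the tracker is created apart from the job's metric group.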
(flink) branch release-1.19 updated: [FLINK-34344] Pass JobID to CheckpointStatsTracker

2024-02-12 Thread roman
This is an automated email from the ASF dual-hosted git repository.

roman pushed a commit to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.19 by this push:
 new 37756561d99 [FLINK-34344] Pass JobID to CheckpointStatsTracker
37756561d99 is described below

commit 37756561d99ff73ba8cbf445c57f57fe11250867
Author: Roman Khachatryan 
AuthorDate: Fri Feb 2 16:02:14 2024 +0100

[FLINK-34344] Pass JobID to CheckpointStatsTracker
---
 .../runtime/checkpoint/CheckpointStatsTracker.java   |  6 --
 .../scheduler/DefaultExecutionGraphFactory.java  |  3 ++-
 .../checkpoint/CheckpointCoordinatorFailureTest.java |  4 +++-
 .../CheckpointCoordinatorMasterHooksTest.java|  2 +-
 .../checkpoint/CheckpointCoordinatorTest.java| 20 +---
 .../CheckpointCoordinatorTestingUtils.java   |  2 +-
 .../checkpoint/CheckpointStatsTrackerTest.java   | 12 ++--
 .../flink/runtime/dispatcher/DispatcherTest.java |  4 +++-
 .../TestingDefaultExecutionGraphBuilder.java |  3 ++-
 .../AbstractCheckpointStatsHandlerTest.java  |  4 +++-
 10 files changed, 38 insertions(+), 22 deletions(-)

diff --git a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java
index cf66341fc06..ea04211d6f0 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointStatsTracker.java
@@ -111,9 +111,11 @@ public class CheckpointStatsTracker {
     * @param numRememberedCheckpoints Maximum number of checkpoints to remember, including in
     *     progress ones.
     * @param metricGroup Metric group for exposed metrics
+    * @param jobID ID of the job being checkpointed
     */
-   public CheckpointStatsTracker(int numRememberedCheckpoints, MetricGroup metricGroup) {
-       this(numRememberedCheckpoints, metricGroup, new JobID(), Integer.MAX_VALUE);
+   public CheckpointStatsTracker(
+           int numRememberedCheckpoints, MetricGroup metricGroup, JobID jobID) {
+       this(numRememberedCheckpoints, metricGroup, jobID, Integer.MAX_VALUE);
    }
 
 CheckpointStatsTracker(
diff --git a/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java b/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
index aaeb8b6d4c7..67e91a887a0 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/DefaultExecutionGraphFactory.java
@@ -129,7 +129,8 @@ public class DefaultExecutionGraphFactory implements ExecutionGraphFactory {
                        () ->
                                new CheckpointStatsTracker(
                                        configuration.get(WebOptions.CHECKPOINTS_HISTORY_SIZE),
-                                       jobManagerJobMetricGroup));
+                                       jobManagerJobMetricGroup,
+                                       jobManagerJobMetricGroup.jobId()));
        this.isDynamicGraph = isDynamicGraph;
        this.executionJobVertexFactory = checkNotNull(executionJobVertexFactory);
        this.nonFinishedHybridPartitionShouldBeUnknown = nonFinishedHybridPartitionShouldBeUnknown;
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java b/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java
index 8873b938f1a..6e6bcffe762 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/checkpoint/CheckpointCoordinatorFailureTest.java
@@ -18,6 +18,7 @@
 
 package org.apache.flink.runtime.checkpoint;
 
+import org.apache.flink.api.common.JobID;
 import org.apache.flink.api.common.JobStatus;
 import org.apache.flink.metrics.groups.UnregisteredMetricsGroup;
import org.apache.flink.runtime.checkpoint.CheckpointCoordinatorTestingUtils.CheckpointCoordinatorBuilder;
@@ -212,7 +213,8 @@ class CheckpointCoordinatorFailureTest {
                new FailingCompletedCheckpointStore(failure);

        CheckpointStatsTracker statsTracker =
-               new CheckpointStatsTracker(Integer.MAX_VALUE, new UnregisteredMetricsGroup());
+               new CheckpointStatsTracker(
+                       Integer.MAX_VALUE, new UnregisteredMetricsGroup(), new JobID());
        final AtomicInteger cleanupCallCount = new AtomicInteger(0);
        final CheckpointCoordinator checkpointCoordinator =
                new CheckpointCoordinatorBuilder(

(flink) branch master updated: [FLINK-33958] Fix IntervalJoin restore test flakiness

2024-02-12 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 1fbf92dfc9e [FLINK-33958] Fix IntervalJoin restore test flakiness
1fbf92dfc9e is described below

commit 1fbf92dfc9ee0e111d6ec740fe87fae27ef87d8b
Author: bvarghese1 
AuthorDate: Thu Jan 25 11:22:22 2024 -0800

[FLINK-33958] Fix IntervalJoin restore test flakiness

- Update input data to make test output predictable
---
 .../plan/nodes/exec/stream/IntervalJoinTestPrograms.java| 13 ++---
 1 file changed, 6 insertions(+), 7 deletions(-)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java
index 4e326af2430..6cc1c546beb 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java
@@ -47,9 +47,9 @@ public class IntervalJoinTestPrograms {
 };
 
 static final Row[] SHIPMENT_AFTER_DATA = {
-Row.of(7, 3, "2020-04-15 08:00:16"),
-Row.of(11, 7, "2020-04-15 08:00:11"),
-Row.of(13, 10, "2020-04-15 08:00:13")
+Row.of(7, 3, "2020-04-15 08:00:15"),
+Row.of(11, 7, "2020-04-15 08:00:16"),
+Row.of(13, 10, "2020-04-15 08:00:16")
 };
 
 static final String[] ORDERS_EVENT_TIME_SCHEMA = {
@@ -102,8 +102,7 @@ public class IntervalJoinTestPrograms {
                                "+I[2, 2020-04-15 08:00:02, 2020-04-15 08:00:05]",
                                "+I[5, 2020-04-15 08:00:05, 2020-04-15 08:00:06]")
                        .consumedAfterRestore(
-                               "+I[7, 2020-04-15 08:00:09, 2020-04-15 08:00:11]",
-                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:13]")
+                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:16]")
                        .build())
 .runSql(
 "INSERT INTO sink_t SELECT\n"
@@ -140,8 +139,8 @@ public class IntervalJoinTestPrograms {
                                "+I[5, 2020-04-15 08:00:05, 2020-04-15 08:00:06]",
                                "+I[4, 2020-04-15 08:00:04, 2020-04-15 08:00:15]")
                        .consumedAfterRestore(
-                               "+I[7, 2020-04-15 08:00:09, 2020-04-15 08:00:11]",
-                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:13]")
+                               "+I[7, 2020-04-15 08:00:09, 2020-04-15 08:00:16]",
+                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:16]")
                        .build())
 .runSql(
 "INSERT INTO sink_t SELECT\n"



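The flakiness fix above works by shifting the SHIPMENT_AFTER_DATA timestamps so each expected row lies unambiguously inside or outside the join window, making the post-restore output deterministic. The full SQL of the test programs is elided in this digest, so the sketch below only illustrates the idea with an assumed inclusive window of [order time, order time + 5 seconds]; the real bound may differ:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Illustrative interval-join predicate (assumed bound, not taken from the patch):
// a shipment joins an order if its timestamp lies in [orderTs, orderTs + boundSeconds].
public class IntervalJoinSketch {

    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    static boolean joins(String orderTs, String shipTs, long boundSeconds) {
        LocalDateTime order = LocalDateTime.parse(orderTs, FMT);
        LocalDateTime ship = LocalDateTime.parse(shipTs, FMT);
        return !ship.isBefore(order) && !ship.isAfter(order.plusSeconds(boundSeconds));
    }

    public static void main(String[] args) {
        // Clearly inside the window: always emitted, no matter where the
        // savepoint/restore boundary falls.
        System.out.println(joins("2020-04-15 08:00:11", "2020-04-15 08:00:13", 5)); // true
        // Clearly outside the window: never emitted.
        System.out.println(joins("2020-04-15 08:00:09", "2020-04-15 08:00:16", 5)); // false
    }
}
```

Rows whose timestamps sit at a window edge, or race the watermark at the restore point, can be emitted either before or after the savepoint depending on timing, which is what made the original expected output flaky.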
(flink) branch release-1.19 updated: [FLINK-33958] Fix IntervalJoin restore test flakiness

2024-02-12 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.19 by this push:
 new 04d3b1b1423 [FLINK-33958] Fix IntervalJoin restore test flakiness
04d3b1b1423 is described below

commit 04d3b1b1423676dc87c366841b1e521beb9953dc
Author: bvarghese1 
AuthorDate: Thu Jan 25 11:22:22 2024 -0800

[FLINK-33958] Fix IntervalJoin restore test flakiness

- Update input data to make test output predictable
---
 .../plan/nodes/exec/stream/IntervalJoinTestPrograms.java| 13 ++---
 1 file changed, 6 insertions(+), 7 deletions(-)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java
index 4e326af2430..6cc1c546beb 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IntervalJoinTestPrograms.java
@@ -47,9 +47,9 @@ public class IntervalJoinTestPrograms {
 };
 
 static final Row[] SHIPMENT_AFTER_DATA = {
-Row.of(7, 3, "2020-04-15 08:00:16"),
-Row.of(11, 7, "2020-04-15 08:00:11"),
-Row.of(13, 10, "2020-04-15 08:00:13")
+Row.of(7, 3, "2020-04-15 08:00:15"),
+Row.of(11, 7, "2020-04-15 08:00:16"),
+Row.of(13, 10, "2020-04-15 08:00:16")
 };
 
 static final String[] ORDERS_EVENT_TIME_SCHEMA = {
@@ -102,8 +102,7 @@ public class IntervalJoinTestPrograms {
                                "+I[2, 2020-04-15 08:00:02, 2020-04-15 08:00:05]",
                                "+I[5, 2020-04-15 08:00:05, 2020-04-15 08:00:06]")
                        .consumedAfterRestore(
-                               "+I[7, 2020-04-15 08:00:09, 2020-04-15 08:00:11]",
-                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:13]")
+                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:16]")
                        .build())
 .runSql(
 "INSERT INTO sink_t SELECT\n"
@@ -140,8 +139,8 @@ public class IntervalJoinTestPrograms {
                                "+I[5, 2020-04-15 08:00:05, 2020-04-15 08:00:06]",
                                "+I[4, 2020-04-15 08:00:04, 2020-04-15 08:00:15]")
                        .consumedAfterRestore(
-                               "+I[7, 2020-04-15 08:00:09, 2020-04-15 08:00:11]",
-                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:13]")
+                               "+I[7, 2020-04-15 08:00:09, 2020-04-15 08:00:16]",
+                               "+I[10, 2020-04-15 08:00:11, 2020-04-15 08:00:16]")
                        .build())
 .runSql(
 "INSERT INTO sink_t SELECT\n"



svn commit: r67287 - /dev/flink/flink-connector-parent-1.1.0-rc1/

2024-02-12 Thread echauchot
Author: echauchot
Date: Mon Feb 12 09:05:38 2024
New Revision: 67287

Log:
[Revert] Delete https://dist.apache.org/repos/dist/dev/flink/flink-connector-parent-1.1.0-rc1

Removed:
dev/flink/flink-connector-parent-1.1.0-rc1/



(flink) branch master updated (1fbf92dfc9e -> 25a604a3a94)

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 1fbf92dfc9e [FLINK-33958] Fix IntervalJoin restore test flakiness
 add 25a604a3a94 [hotfix][build] Wire up spotless.skip property

No new revisions were added by this update.

Summary of changes:
 pom.xml | 3 +++
 1 file changed, 3 insertions(+)



svn commit: r67292 - in /dev/flink/flink-connector-parent-1.1.0-rc1: ./ flink-connector-parent-1.1.0-src.tgz flink-connector-parent-1.1.0-src.tgz.asc flink-connector-parent-1.1.0-src.tgz.sha512

2024-02-12 Thread echauchot
Author: echauchot
Date: Mon Feb 12 10:05:44 2024
New Revision: 67292

Log:
Add flink-connector-parent-1.1.0-rc1

Added:
    dev/flink/flink-connector-parent-1.1.0-rc1/
    dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz   (with props)
    dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
    dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512

Added: dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz
==============================================================================
Binary file - no diff available.

Propchange: dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz
------------------------------------------------------------------------------
    svn:mime-type = application/octet-stream

Added: dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
==============================================================================
--- dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc (added)
+++ dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc Mon Feb 12 10:05:44 2024
@@ -0,0 +1,11 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQEzBAABCgAdFiEE0adroZ1ilN0AM/aEOgGfC43RY+oFAmXJ7W4ACgkQOgGfC43R
+Y+pRngf9E5A/7MecaNhkarPwdCeYWDDDb1L2GVNTPHBQc/dYNZ7SkE+ex8uaD8WC
+vYkGTEeQQUyqDqK48HKzqPMtP9rxUFWY8w2nY81ULy9HUpPoQfVl9WHXMGtcUypc
+YCsFcETBkSVj0Wb38+5hiAqWYc8zjaplE5nR8d0W5VzcfmnOdrKfktDTCz9NfW2H
+UyVCLBFpJ+o1gefhVGgRIulUFWysf81AoH6EpkfSc+rH34iWaLaVpexRr9N6OuR2
+Rby+puTo96KrmQcM6oNxbYE/hsaLrPYbfQS+UPToJWqO16GAQKJatbrfNBgxJ7on
+cLYfJTckVvPsxUP4Us6Mauy4O63sCA==
+=dYWa
+-----END PGP SIGNATURE-----

Added: dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
==============================================================================
--- dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512 (added)
+++ dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512 Mon Feb 12 10:05:44 2024
@@ -0,0 +1 @@
+797e39599d3e6d0c87432c49a8dec44eaddfb96d8075d390312ab8e2e54dfe5e34dd5a5d131f0b065b1fbb1bd24a1b7da82fc663f019ab1dae7af1bb07dbddff  flink-connector-parent-1.1.0-src.tgz




svn commit: r67294 - /dev/flink/flink-connector-parent-1.1.0-rc1/

2024-02-12 Thread echauchot
Author: echauchot
Date: Mon Feb 12 10:28:12 2024
New Revision: 67294

Log:
[Revert] Delete https://dist.apache.org/repos/dist/dev/flink/flink-connector-parent-1.1.0-rc1

Removed:
dev/flink/flink-connector-parent-1.1.0-rc1/



(flink) branch release-1.18 updated: [FLINK-34422][test] BatchTestBase uses MiniClusterExtension

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch release-1.18
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.18 by this push:
 new d69393678ef [FLINK-34422][test] BatchTestBase uses MiniClusterExtension
d69393678ef is described below

commit d69393678efe7e26bd5168407a1c862cd4a0e148
Author: Chesnay Schepler 
AuthorDate: Sun Feb 11 01:38:45 2024 +0100

[FLINK-34422][test] BatchTestBase uses MiniClusterExtension
---
 .../batch/sql/CompactManagedTableITCase.java   |  6 +++--
 .../planner/runtime/utils/BatchTestBase.scala  | 30 ++
 2 files changed, 24 insertions(+), 12 deletions(-)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
index e5b263652a6..05c1b87654b 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
@@ -61,8 +61,7 @@ import static org.assertj.core.api.Assertions.fail;
 /** IT Case for testing managed table compaction. */
 public class CompactManagedTableITCase extends BatchTestBase {
 
-    private final ObjectIdentifier tableIdentifier =
-            ObjectIdentifier.of(tEnv().getCurrentCatalog(), tEnv().getCurrentDatabase(), "MyTable");
+    private ObjectIdentifier tableIdentifier;
     private final Map> collectedElements = new HashMap<>();
 
 private Path rootPath;
@@ -73,6 +72,9 @@ public class CompactManagedTableITCase extends BatchTestBase {
 @Before
 public void before() throws Exception {
 super.before();
+        tableIdentifier =
+                ObjectIdentifier.of(
+                        tEnv().getCurrentCatalog(), tEnv().getCurrentDatabase(), "MyTable");
         MANAGED_TABLES.put(tableIdentifier, new AtomicReference<>());
         referenceOfManagedTableFileEntries = new AtomicReference<>();
         MANAGED_TABLE_FILE_ENTRIES.put(tableIdentifier, referenceOfManagedTableFileEntries);
diff --git a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
index 0645473e57a..9496940db7a 100644
--- a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
+++ b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
@@ -53,20 +53,17 @@ import org.apache.calcite.runtime.CalciteContextException
 import org.apache.calcite.sql.SqlExplainLevel
 import org.apache.calcite.sql.parser.SqlParseException
 import org.assertj.core.api.Assertions.fail
+import org.junit.Before
 import org.junit.jupiter.api.{AfterEach, BeforeEach}
 
 class BatchTestBase extends BatchAbstractTestBase {
 
   protected var settings = EnvironmentSettings.newInstance().inBatchMode().build()
-  protected var testingTableEnv: TestingTableEnvironment = TestingTableEnvironment
-    .create(settings, catalogManager = None, TableConfig.getDefault)
-  protected var tEnv: TableEnvironment = testingTableEnv
-  tEnv.getConfig.set(BatchExecutionOptions.ADAPTIVE_AUTO_PARALLELISM_ENABLED, Boolean.box(false))
-  protected var planner =
-    tEnv.asInstanceOf[TableEnvironmentImpl].getPlanner.asInstanceOf[PlannerBase]
-  protected var env: StreamExecutionEnvironment = planner.getExecEnv
-  env.getConfig.enableObjectReuse()
-  protected var tableConfig: TableConfig = tEnv.getConfig
+  protected var testingTableEnv: TestingTableEnvironment = _
+  protected var tEnv: TableEnvironment = _
+  protected var planner: PlannerBase = _
+  protected var env: StreamExecutionEnvironment = _
+  protected var tableConfig: TableConfig = _

   val LINE_COL_PATTERN: Pattern = Pattern.compile("At line ([0-9]+), column ([0-9]+)")
   val LINE_COL_TWICE_PATTERN: Pattern = Pattern.compile(
@@ -75,10 +72,23 @@ class BatchTestBase extends BatchAbstractTestBase {

   @throws(classOf[Exception])
   @BeforeEach
-  def before(): Unit = {
+  @Before
+  def setupEnv(): Unit = {
+    testingTableEnv = TestingTableEnvironment
+      .create(settings, catalogManager = None, TableConfig.getDefault)
+    tEnv = testingTableEnv
+    tEnv.getConfig.set(BatchExecutionOptions.ADAPTIVE_AUTO_PARALLELISM_ENABLED, Boolean.box(false))
+    planner = tEnv.asInstanceOf[TableEnvironmentImpl].getPlanner.asInstanceOf[PlannerBase]
+    env = planner.getExecEnv
+    env.getConfig.enableObjectReuse()
+    tableConfig = tEnv.getConfig
     BatchTestBase.configForMiniCluster(tableCon

(flink) 01/03: [FLINK-34422] Migrate BatchTestBase subclass to jUnit5

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 65727fb943807f1ff5345419ce389c5734df0cb4
Author: Chesnay Schepler 
AuthorDate: Sun Feb 11 16:06:59 2024 +0100

[FLINK-34422] Migrate BatchTestBase subclass to jUnit5

Usually this should've been done _before_ you ban jUnit4 annotations in BatchTestBase...
---
 .../java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
index 1cc1a200d3a..ea869241ea2 100644
--- a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
+++ b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
@@ -27,7 +27,7 @@ import org.apache.flink.table.planner.runtime.utils.BatchTestBase;
 import org.apache.flink.types.Row;
 import org.apache.flink.util.CloseableIterator;
 
-import org.junit.Test;
+import org.junit.jupiter.api.Test;
 
 import java.util.Map;
 import java.util.concurrent.ExecutionException;



(flink) branch master updated (25a604a3a94 -> 9caa3bbb042)

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 25a604a3a94 [hotfix][build] Wire up spotless.skip property
 new 65727fb9438 [FLINK-34422] Migrate BatchTestBase subclass to jUnit5
 new 4c4643c3251 [FLINK-34422][test] BatchTestBase uses MiniClusterExtension
 new 9caa3bbb042 [hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 ...ufSQLITCaseTest.java => ProtobufSQLITCase.java} |  4 ++--
 .../batch/sql/CompactManagedTableITCase.java   |  6 +++--
 .../planner/runtime/utils/BatchTestBase.scala  | 28 ++
 3 files changed, 24 insertions(+), 14 deletions(-)
 rename flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/{ProtobufSQLITCaseTest.java => ProtobufSQLITCase.java} (99%)



(flink) 02/03: [FLINK-34422][test] BatchTestBase uses MiniClusterExtension

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 4c4643c3251c284260c96a2110f4b78c8a369723
Author: Chesnay Schepler 
AuthorDate: Sun Feb 11 01:38:45 2024 +0100

[FLINK-34422][test] BatchTestBase uses MiniClusterExtension
---
 .../batch/sql/CompactManagedTableITCase.java   |  6 +++--
 .../planner/runtime/utils/BatchTestBase.scala  | 28 ++
 2 files changed, 22 insertions(+), 12 deletions(-)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
index 4974b14feda..d4b3cbce27c 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
@@ -61,8 +61,7 @@ import static org.assertj.core.api.Assertions.fail;
 /** IT Case for testing managed table compaction. */
 class CompactManagedTableITCase extends BatchTestBase {
 
-    private final ObjectIdentifier tableIdentifier =
-            ObjectIdentifier.of(tEnv().getCurrentCatalog(), tEnv().getCurrentDatabase(), "MyTable");
+    private ObjectIdentifier tableIdentifier;
     private final Map> collectedElements = new HashMap<>();
 
 private Path rootPath;
@@ -73,6 +72,9 @@ class CompactManagedTableITCase extends BatchTestBase {
 @BeforeEach
 public void before() throws Exception {
 super.before();
+        tableIdentifier =
+                ObjectIdentifier.of(
+                        tEnv().getCurrentCatalog(), tEnv().getCurrentDatabase(), "MyTable");
         MANAGED_TABLES.put(tableIdentifier, new AtomicReference<>());
         referenceOfManagedTableFileEntries = new AtomicReference<>();
         MANAGED_TABLE_FILE_ENTRIES.put(tableIdentifier, referenceOfManagedTableFileEntries);
diff --git a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
index fb5a9a058ca..cb509321f34 100644
--- a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
+++ b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
@@ -57,15 +57,11 @@ import org.junit.jupiter.api.{AfterEach, BeforeEach}
 class BatchTestBase extends BatchAbstractTestBase {
 
   protected var settings = EnvironmentSettings.newInstance().inBatchMode().build()
-  protected var testingTableEnv: TestingTableEnvironment = TestingTableEnvironment
-    .create(settings, catalogManager = None, TableConfig.getDefault)
-  protected var tEnv: TableEnvironment = testingTableEnv
-  tEnv.getConfig.set(BatchExecutionOptions.ADAPTIVE_AUTO_PARALLELISM_ENABLED, Boolean.box(false))
-  protected var planner =
-    tEnv.asInstanceOf[TableEnvironmentImpl].getPlanner.asInstanceOf[PlannerBase]
-  protected var env: StreamExecutionEnvironment = planner.getExecEnv
-  env.getConfig.enableObjectReuse()
-  protected var tableConfig: TableConfig = tEnv.getConfig
+  protected var testingTableEnv: TestingTableEnvironment = _
+  protected var tEnv: TableEnvironment = _
+  protected var planner: PlannerBase = _
+  protected var env: StreamExecutionEnvironment = _
+  protected var tableConfig: TableConfig = _

   val LINE_COL_PATTERN: Pattern = Pattern.compile("At line ([0-9]+), column ([0-9]+)")
   val LINE_COL_TWICE_PATTERN: Pattern = Pattern.compile(
@@ -74,10 +70,22 @@ class BatchTestBase extends BatchAbstractTestBase {

   @throws(classOf[Exception])
   @BeforeEach
-  def before(): Unit = {
+  def setupEnv(): Unit = {
+    testingTableEnv = TestingTableEnvironment
+      .create(settings, catalogManager = None, TableConfig.getDefault)
+    tEnv = testingTableEnv
+    tEnv.getConfig.set(BatchExecutionOptions.ADAPTIVE_AUTO_PARALLELISM_ENABLED, Boolean.box(false))
+    planner = tEnv.asInstanceOf[TableEnvironmentImpl].getPlanner.asInstanceOf[PlannerBase]
+    env = planner.getExecEnv
+    env.getConfig.enableObjectReuse()
+    tableConfig = tEnv.getConfig
     BatchTestBase.configForMiniCluster(tableConfig)
   }

+  @throws(classOf[Exception])
+  @BeforeEach
+  def before(): Unit = {}
+
   @AfterEach
   def after(): Unit = {
     TestValuesTableFactory.clearAllData()



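The BatchTestBase change above moves all field initialization out of eager initializers, which run when the object is constructed and therefore before any JUnit extension callback, into a setup method that the framework invokes after extensions such as MiniClusterExtension have started. A tiny framework-free sketch of that ordering concern; the class and field names below are stand-ins, not the real Flink types:

```java
// Sketch of the lazy-initialization pattern used in the BatchTestBase fix.
// Eager field initializers run at construction time, before a test framework
// can start shared infrastructure (e.g. a mini cluster); deferring the
// initialization into a setup method fixes the ordering.
public class LazyInitSketch {

    static class TestBase {
        // Declared but not initialized, mirroring `protected var tEnv: TableEnvironment = _`.
        protected String tEnv;
        protected String planner;

        // Analogous to the @BeforeEach setupEnv() introduced by the patch;
        // the framework calls this only after its own extensions are up.
        void setupEnv() {
            tEnv = "TestingTableEnvironment";
            planner = tEnv + ".planner";
        }
    }

    public static void main(String[] args) {
        TestBase test = new TestBase();
        System.out.println(test.tEnv); // prints "null": nothing runs at construction
        test.setupEnv();               // the framework lifecycle kicks in here
        System.out.println(test.planner); // prints "TestingTableEnvironment.planner"
    }
}
```

Note that the patch also keeps an empty `before()` alongside the new `setupEnv()`, so subclasses that override `before()` still get a lifecycle hook.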
(flink) 03/03: [hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 9caa3bbb042901aabb84b33098c18af13e026a57
Author: Chesnay Schepler 
AuthorDate: Mon Feb 12 11:50:39 2024 +0100

[hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase
---
 .../protobuf/{ProtobufSQLITCaseTest.java => ProtobufSQLITCase.java} | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCase.java
similarity index 99%
rename from flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
rename to flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCase.java
index ea869241ea2..422574115e8 100644
--- a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
+++ b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCase.java
@@ -40,7 +40,7 @@ import static org.junit.Assert.assertNull;
 import static org.junit.Assert.fail;
 
 /** Integration SQL test for protobuf. */
-public class ProtobufSQLITCaseTest extends BatchTestBase {
+public class ProtobufSQLITCase extends BatchTestBase {
 
 private MapTest getProtoTestObject() {
 MapTest.InnerMessageTest innerMessageTest =



(flink-docker) branch master updated (1a356e9 -> 578731b)

2024-02-12 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git


from 1a356e9  [HOTFIX][Doc] update readme to clarify that generate-stackbrew-library-docker.sh should be run on local machine directly.
 new 2c169b6  [FLINK-34282][docker] Updates snapshot workflow to consider the dev-1.19 branch and new dev-master.
 new c7ba1ce  [hotfix][release] Removes 1.15 and 1.16 as part of the 1.19.0 release preparations
 new 578731b  [hotfix][docs] Adds comment on Docker image version removal
 new 578731b  [hotfix][docs] Adds comment on Docker image version removal

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/snapshot.yml |   6 +-
 1.15/scala_2.12-java11-ubuntu/Dockerfile   |  94 -
 1.15/scala_2.12-java11-ubuntu/docker-entrypoint.sh | 152 -
 1.15/scala_2.12-java11-ubuntu/release.metadata |   2 -
 1.15/scala_2.12-java8-ubuntu/Dockerfile|  94 -
 1.15/scala_2.12-java8-ubuntu/docker-entrypoint.sh  | 152 -
 1.15/scala_2.12-java8-ubuntu/release.metadata  |   2 -
 1.16/scala_2.12-java11-ubuntu/Dockerfile   |  94 -
 1.16/scala_2.12-java11-ubuntu/docker-entrypoint.sh | 152 -
 1.16/scala_2.12-java11-ubuntu/release.metadata |   2 -
 1.16/scala_2.12-java8-ubuntu/Dockerfile|  94 -
 1.16/scala_2.12-java8-ubuntu/docker-entrypoint.sh  | 152 -
 1.16/scala_2.12-java8-ubuntu/release.metadata  |   2 -
 README.md  |   1 +
 14 files changed, 4 insertions(+), 995 deletions(-)
 delete mode 100644 1.15/scala_2.12-java11-ubuntu/Dockerfile
 delete mode 100755 1.15/scala_2.12-java11-ubuntu/docker-entrypoint.sh
 delete mode 100644 1.15/scala_2.12-java11-ubuntu/release.metadata
 delete mode 100644 1.15/scala_2.12-java8-ubuntu/Dockerfile
 delete mode 100755 1.15/scala_2.12-java8-ubuntu/docker-entrypoint.sh
 delete mode 100644 1.15/scala_2.12-java8-ubuntu/release.metadata
 delete mode 100644 1.16/scala_2.12-java11-ubuntu/Dockerfile
 delete mode 100755 1.16/scala_2.12-java11-ubuntu/docker-entrypoint.sh
 delete mode 100644 1.16/scala_2.12-java11-ubuntu/release.metadata
 delete mode 100644 1.16/scala_2.12-java8-ubuntu/Dockerfile
 delete mode 100755 1.16/scala_2.12-java8-ubuntu/docker-entrypoint.sh
 delete mode 100644 1.16/scala_2.12-java8-ubuntu/release.metadata



(flink-docker) 02/03: [hotfix][release] Removes 1.15 and 1.16 as part of the 1.19.0 release preparations

2024-02-12 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git

commit c7ba1ce058a860a23675910ba42b50f744c8c2e4
Author: Matthias Pohl 
AuthorDate: Fri Feb 9 12:10:07 2024 +0100

[hotfix][release] Removes 1.15 and 1.16 as part of the 1.19.0 release preparations

- 1.15 and 1.16 are already deprecated
- 1.17 is kept because we still might do a final patch release as part of the 1.19.0 release
---
 .github/workflows/snapshot.yml |   2 -
 1.15/scala_2.12-java11-ubuntu/Dockerfile   |  94 -
 1.15/scala_2.12-java11-ubuntu/docker-entrypoint.sh | 152 -
 1.15/scala_2.12-java11-ubuntu/release.metadata |   2 -
 1.15/scala_2.12-java8-ubuntu/Dockerfile|  94 -
 1.15/scala_2.12-java8-ubuntu/docker-entrypoint.sh  | 152 -
 1.15/scala_2.12-java8-ubuntu/release.metadata  |   2 -
 1.16/scala_2.12-java11-ubuntu/Dockerfile   |  94 -
 1.16/scala_2.12-java11-ubuntu/docker-entrypoint.sh | 152 -
 1.16/scala_2.12-java11-ubuntu/release.metadata |   2 -
 1.16/scala_2.12-java8-ubuntu/Dockerfile|  94 -
 1.16/scala_2.12-java8-ubuntu/docker-entrypoint.sh  | 152 -
 1.16/scala_2.12-java8-ubuntu/release.metadata  |   2 -
 13 files changed, 994 deletions(-)

diff --git a/.github/workflows/snapshot.yml b/.github/workflows/snapshot.yml
index 3d66cd7..2389cd1 100644
--- a/.github/workflows/snapshot.yml
+++ b/.github/workflows/snapshot.yml
@@ -46,8 +46,6 @@ jobs:
 branch: dev-1.18
   - flink_version: 1.17-SNAPSHOT
 branch: dev-1.17
-  - flink_version: 1.16-SNAPSHOT
-branch: dev-1.16
 steps:
   - uses: actions/checkout@v3
 with:
diff --git a/1.15/scala_2.12-java11-ubuntu/Dockerfile b/1.15/scala_2.12-java11-ubuntu/Dockerfile
deleted file mode 100644
index 2f38d26..000
--- a/1.15/scala_2.12-java11-ubuntu/Dockerfile
+++ /dev/null
@@ -1,94 +0,0 @@
-###
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
-#  regarding copyright ownership.  The ASF licenses this file
-#  to you under the Apache License, Version 2.0 (the
-#  "License"); you may not use this file except in compliance
-#  with the License.  You may obtain a copy of the License at
-#
-#  http://www.apache.org/licenses/LICENSE-2.0
-#
-#  Unless required by applicable law or agreed to in writing, software
-#  distributed under the License is distributed on an "AS IS" BASIS,
-#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#  See the License for the specific language governing permissions and
-# limitations under the License.
-###
-
-FROM eclipse-temurin:11-jre-jammy
-
-# Install dependencies
-RUN set -ex; \
-  apt-get update; \
-  apt-get -y install gpg libsnappy1v5 gettext-base libjemalloc-dev; \
-  rm -rf /var/lib/apt/lists/*
-
-# Grab gosu for easy step-down from root
-ENV GOSU_VERSION 1.11
-RUN set -ex; \
-  wget -nv -O /usr/local/bin/gosu "https://github.com/tianon/gosu/releases/download/$GOSU_VERSION/gosu-$(dpkg --print-architecture)"; \
-  wget -nv -O /usr/local/bin/gosu.asc "https://github.com/tianon/gosu/releases/download/$GOSU_VERSION/gosu-$(dpkg --print-architecture).asc"; \
-  export GNUPGHOME="$(mktemp -d)"; \
-  for server in ha.pool.sks-keyservers.net $(shuf -e \
-  hkp://p80.pool.sks-keyservers.net:80 \
-  keyserver.ubuntu.com \
-  hkp://keyserver.ubuntu.com:80 \
-  pgp.mit.edu) ; do \
-  gpg --batch --keyserver "$server" --recv-keys B42F6819007F00F88E364FD4036A9C25BF357DD4 && break || : ; \
-  done && \
-  gpg --batch --verify /usr/local/bin/gosu.asc /usr/local/bin/gosu; \
-  gpgconf --kill all; \
-  rm -rf "$GNUPGHOME" /usr/local/bin/gosu.asc; \
-  chmod +x /usr/local/bin/gosu; \
-  gosu nobody true
-
-# Configure Flink version
-ENV FLINK_TGZ_URL=https://www.apache.org/dyn/closer.cgi?action=download&filename=flink/flink-1.15.4/flink-1.15.4-bin-scala_2.12.tgz \
-    FLINK_ASC_URL=https://downloads.apache.org/flink/flink-1.15.4/flink-1.15.4-bin-scala_2.12.tgz.asc \
-GPG_KEY=0F79F2AFB2351BC29678544591F9C1EC125FD8DB \
-CHECK_GPG=true
-
-# Prepare environment
-ENV FLINK_HOME=/opt/flink
-ENV PATH=$FLINK_HOME/bin:$PATH
-RUN groupadd --system --gid= flink && \
-useradd --system --home-dir $FLINK_HOME --uid= --gid=flink flink
-WORKDIR $FLINK_HOME
-
-# Install Flink
-RUN set -ex; \
-  wget -nv -O flink.tgz "$FLINK_TGZ_URL"; \
-  \
-  if [

(flink-docker) 03/03: [hotfix][docs] Adds comment on Docker image version removal

2024-02-12 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git

commit 578731b7a507d765c554efbfb7c6535976862d2d
Author: Matthias Pohl 
AuthorDate: Fri Feb 9 15:29:56 2024 +0100

[hotfix][docs] Adds comment on Docker image version removal
---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 263bc66..8b0af05 100644
--- a/README.md
+++ b/README.md
@@ -83,6 +83,7 @@ Updating the Dockerfiles involves the following steps:
 2. Update Dockerfiles on the the `master` branch
 * Remove any existing Dockerfiles from the same major version
 * e.g. `rm -r 1.2`, if the new Flink version is `1.2.1`
+* Remove any not-supported versions (i.e. versions that are not meant to be released anymore)
 * Copy the generated Dockerfiles from the `dev-x.y`/`dev-master` branch to `master`
 * Commit the changes with message `Update Dockerfiles for x.y.z release` \[[example](https://github.com/apache/flink-docker/commit/5920fd775ca1a8d03ee959d79bceeb5d6e8f35a1)]



(flink-docker) 01/03: [FLINK-34282][docker] Updates snapshot workflow to consider the dev-1.19 branch and new dev-master.

2024-02-12 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git

commit 2c169b6a83bf83bbe997ed35aaf548de10050b58
Author: Matthias Pohl 
AuthorDate: Fri Feb 9 12:08:21 2024 +0100

[FLINK-34282][docker] Updates snapshot workflow to consider the dev-1.19 branch and new dev-master.
---
 .github/workflows/snapshot.yml | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/snapshot.yml b/.github/workflows/snapshot.yml
index 1d7d53a..3d66cd7 100644
--- a/.github/workflows/snapshot.yml
+++ b/.github/workflows/snapshot.yml
@@ -38,8 +38,10 @@ jobs:
   matrix:
 java_version: [8, 11]
 build:
-  - flink_version: 1.19-SNAPSHOT
+  - flink_version: 1.20-SNAPSHOT
 branch: dev-master
+  - flink_version: 1.19-SNAPSHOT
+branch: dev-1.19
   - flink_version: 1.18-SNAPSHOT
 branch: dev-1.18
   - flink_version: 1.17-SNAPSHOT



(flink-connector-shared-utils) annotated tag v1.1.0-rc1 deleted (was 53765e1)

2024-02-12 Thread echauchot
This is an automated email from the ASF dual-hosted git repository.

echauchot pushed a change to annotated tag v1.1.0-rc1
in repository https://gitbox.apache.org/repos/asf/flink-connector-shared-utils.git


*** WARNING: tag v1.1.0-rc1 was deleted! ***

   tag was  53765e1

This change permanently discards the following revisions:

 discard 08e4ef3  Update version to 1.1.0



(flink-connector-shared-utils) annotated tag v1.1.0-rc1 updated (2c911d3 -> af772f3)

2024-02-12 Thread echauchot
This is an automated email from the ASF dual-hosted git repository.

echauchot pushed a change to annotated tag v1.1.0-rc1
in repository https://gitbox.apache.org/repos/asf/flink-connector-shared-utils.git


*** WARNING: tag v1.1.0-rc1 was modified! ***

from 2c911d3  (commit)
  to af772f3  (tag)
 tagging 2c911d309dedefa618c9fad1844cf10b048029e3 (commit)
 replaces v1.0.0
  by Etienne Chauchot
  on Mon Feb 12 15:21:55 2024 +0100

- Log -
v1.1.0-rc1
-BEGIN PGP SIGNATURE-

iQEzBAABCgAdFiEE0adroZ1ilN0AM/aEOgGfC43RY+oFAmXKKYMACgkQOgGfC43R
Y+r7gAgA2mhQfSSyZA5qS4nfCJn0/MOTUDVw9+nwCKS/Bie5FB75c6KbcUuYIJ6U
hbh1aMIvci40mxnNoFIeWCfLyAkzOVCAOOi39U8bmYZhru3HDSkG7pj5qeZFZCcj
qfvSc6neOa5AGbqExSFoLZzCoyPx97e0fUxW7fv8TeRzYEOo2o1f5YBGeZEXzF8T
t/cDSR+xKzzSnhD+qF1sqc6EpIEqcF+SPyd/1Z9VxCk7jESzawWhIMbmVogsvh0G
k2ma5pxO9OexQ3hbEIkMg/MkkIxEK3RtLKG4DDtuXPd6TQcyIzQwiAgge9evDtJf
ntiq4hp07itSvOd9bV5PZg4HhSmi8w==
=50Nz
-END PGP SIGNATURE-
---


No new revisions were added by this update.

Summary of changes:



svn commit: r67304 - in /dev/flink/flink-connector-parent-1.1.0-rc1: ./ flink-connector-parent-1.1.0-src.tgz flink-connector-parent-1.1.0-src.tgz.asc flink-connector-parent-1.1.0-src.tgz.sha512

2024-02-12 Thread echauchot
Author: echauchot
Date: Mon Feb 12 15:45:10 2024
New Revision: 67304

Log:
Add flink-connector-parent-1.1.0-rc1

Added:
dev/flink/flink-connector-parent-1.1.0-rc1/

dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz 
  (with props)

dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc

dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512

Added: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz
==
Binary file - no diff available.

Propchange: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz
--
svn:mime-type = application/octet-stream

Added: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
==
--- 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
 (added)
+++ 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
 Mon Feb 12 15:45:10 2024
@@ -0,0 +1,11 @@
+-BEGIN PGP SIGNATURE-
+
+iQEzBAABCgAdFiEE0adroZ1ilN0AM/aEOgGfC43RY+oFAmXKPPwACgkQOgGfC43R
+Y+rKEQf/SGt886IO3YkJwwzAErzLihsDjhavIwc8aUo3ceW/yRYi7ye7GL1Gc8/J
+aXS955ZypX63tmIwHz+SZDowr6MKOgXL3BUHCdqtWNZb+M/rpwXORwK4xgiKSNBh
+DiQc0q7iFk9NjnCUf2oOnSFoiGZ2FZBSEFQizlsu2NRTR6J9/eRKK8F/72Obadch
+N96AsieoWuyiRMAFd3JyQhgqzVi1D2x99JMk6HP78WvKVLUChhJ+3C5l3r6nDq/G
+5jDwYgrdYmXOzGikGtjWdLxlBFXNSvcqrtqu0w3gehELXiUSaP1+PWsL8D4Mr7z2
+dOPdKjsywpWkOdEMARyzS3SeCBHxfw==
+=SG/n
+-END PGP SIGNATURE-

Added: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
==
--- 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
 (added)
+++ 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
 Mon Feb 12 15:45:10 2024
@@ -0,0 +1 @@
+986d34a923ed596acc90f12c29c26a0232378f9c1f4339f3e74a1918d8cb0f7a6499514c6a90209baab31f07342a18023c7504c3fff36fda9ac5698eca269c6e
  flink-connector-parent-1.1.0-src.tgz




svn commit: r67305 - /dev/flink/flink-connector-parent-1.1.0-rc1/

2024-02-12 Thread echauchot
Author: echauchot
Date: Mon Feb 12 15:46:58 2024
New Revision: 67305

Log:
[Revert] Delete 
https://dist.apache.org/repos/dist/dev/flink/flink-connector-parent-1.1.0-rc1

Removed:
dev/flink/flink-connector-parent-1.1.0-rc1/



svn commit: r67307 - in /dev/flink/flink-connector-parent-1.1.0-rc1: ./ flink-connector-parent-1.1.0-src.tgz flink-connector-parent-1.1.0-src.tgz.asc flink-connector-parent-1.1.0-src.tgz.sha512

2024-02-12 Thread echauchot
Author: echauchot
Date: Mon Feb 12 15:59:30 2024
New Revision: 67307

Log:
Add flink-connector-parent-1.1.0-rc1

Added:
dev/flink/flink-connector-parent-1.1.0-rc1/

dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz 
  (with props)

dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc

dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512

Added: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz
==
Binary file - no diff available.

Propchange: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz
--
svn:mime-type = application/octet-stream

Added: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
==
--- 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
 (added)
+++ 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.asc
 Mon Feb 12 15:59:30 2024
@@ -0,0 +1,11 @@
+-BEGIN PGP SIGNATURE-
+
+iQEzBAABCgAdFiEE0adroZ1ilN0AM/aEOgGfC43RY+oFAmXKQF4ACgkQOgGfC43R
+Y+qIxwgAi2v26AnxQ+1/0kZnljqczyTrB/jBhc7taF9xrOS5i8MFk0tHgot7V/dy
+unvPJb8BxkytRw/MqToHdjK3jHd0rk4zpUAeEm4/ZU5wNtSpdJmGwG34C75og96Z
+n/cw3TA/sCL0/Ovmz/YxPvtYnEilVIW4QwnPEbQb5Yw5Z8H41k2pBV0imFCblMaW
+JtXIu6XFohZ6DMIpV/exjG+MVA3CusnpCpdOLH4QKsQE7LPAv5MHX8aJ2p3Svj4Q
+3LkhY1lQyDxqCBKduTljD81Vi5pCAPrZgDj8iM8wRuCqN+Coc5Sl1oRekzYA43Lg
+IQapJgw81PR7rXSA8D2LgEm5+3Pskg==
+=esZa
+-END PGP SIGNATURE-

Added: 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
==
--- 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
 (added)
+++ 
dev/flink/flink-connector-parent-1.1.0-rc1/flink-connector-parent-1.1.0-src.tgz.sha512
 Mon Feb 12 15:59:30 2024
@@ -0,0 +1 @@
+80d929e034a35e81195a61b0fcdebb61c03a22d49b3ea6f72b304fcd05c0e3801668d146f2b1e397de6caa910123eb4fbdcfaeb1497d9ad9d7cbadeda57a
  flink-connector-parent-1.1.0-src.tgz




(flink) branch release-1.19 updated (04d3b1b1423 -> e7ac9887f92)

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git


from 04d3b1b1423 [FLINK-33958] Fix IntervalJoin restore test flakiness
 new 994850d33a3 [FLINK-34422] Migrate BatchTestBase subclass to jUnit5
 new 3fcbe3df489 [FLINK-34422][test] BatchTestBase uses MiniClusterExtension
 new e7ac9887f92 [hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 ...ufSQLITCaseTest.java => ProtobufSQLITCase.java} |  4 ++--
 .../batch/sql/CompactManagedTableITCase.java   |  6 +++--
 .../planner/runtime/utils/BatchTestBase.scala  | 28 ++
 3 files changed, 24 insertions(+), 14 deletions(-)
 rename flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/{ProtobufSQLITCaseTest.java => ProtobufSQLITCase.java} (99%)



(flink) 02/03: [FLINK-34422][test] BatchTestBase uses MiniClusterExtension

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 3fcbe3df48904d10ae29a35800474b18af9e7172
Author: Chesnay Schepler 
AuthorDate: Sun Feb 11 01:38:45 2024 +0100

[FLINK-34422][test] BatchTestBase uses MiniClusterExtension
---
 .../batch/sql/CompactManagedTableITCase.java   |  6 +++--
 .../planner/runtime/utils/BatchTestBase.scala  | 28 ++
 2 files changed, 22 insertions(+), 12 deletions(-)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
index 4974b14feda..d4b3cbce27c 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/batch/sql/CompactManagedTableITCase.java
@@ -61,8 +61,7 @@ import static org.assertj.core.api.Assertions.fail;
 /** IT Case for testing managed table compaction. */
 class CompactManagedTableITCase extends BatchTestBase {
 
-private final ObjectIdentifier tableIdentifier =
-ObjectIdentifier.of(tEnv().getCurrentCatalog(), 
tEnv().getCurrentDatabase(), "MyTable");
+private ObjectIdentifier tableIdentifier;
 private final Map> collectedElements = 
new HashMap<>();
 
 private Path rootPath;
@@ -73,6 +72,9 @@ class CompactManagedTableITCase extends BatchTestBase {
 @BeforeEach
 public void before() throws Exception {
 super.before();
+tableIdentifier =
+ObjectIdentifier.of(
+tEnv().getCurrentCatalog(), 
tEnv().getCurrentDatabase(), "MyTable");
 MANAGED_TABLES.put(tableIdentifier, new AtomicReference<>());
 referenceOfManagedTableFileEntries = new AtomicReference<>();
 MANAGED_TABLE_FILE_ENTRIES.put(tableIdentifier, 
referenceOfManagedTableFileEntries);
diff --git a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
index fb5a9a058ca..cb509321f34 100644
--- a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
+++ b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/runtime/utils/BatchTestBase.scala
@@ -57,15 +57,11 @@ import org.junit.jupiter.api.{AfterEach, BeforeEach}
 class BatchTestBase extends BatchAbstractTestBase {
 
   protected var settings = 
EnvironmentSettings.newInstance().inBatchMode().build()
-  protected var testingTableEnv: TestingTableEnvironment = 
TestingTableEnvironment
-.create(settings, catalogManager = None, TableConfig.getDefault)
-  protected var tEnv: TableEnvironment = testingTableEnv
-  tEnv.getConfig.set(BatchExecutionOptions.ADAPTIVE_AUTO_PARALLELISM_ENABLED, 
Boolean.box(false))
-  protected var planner =
-
tEnv.asInstanceOf[TableEnvironmentImpl].getPlanner.asInstanceOf[PlannerBase]
-  protected var env: StreamExecutionEnvironment = planner.getExecEnv
-  env.getConfig.enableObjectReuse()
-  protected var tableConfig: TableConfig = tEnv.getConfig
+  protected var testingTableEnv: TestingTableEnvironment = _
+  protected var tEnv: TableEnvironment = _
+  protected var planner: PlannerBase = _
+  protected var env: StreamExecutionEnvironment = _
+  protected var tableConfig: TableConfig = _
 
   val LINE_COL_PATTERN: Pattern = Pattern.compile("At line ([0-9]+), column 
([0-9]+)")
   val LINE_COL_TWICE_PATTERN: Pattern = Pattern.compile(
@@ -74,10 +70,22 @@ class BatchTestBase extends BatchAbstractTestBase {
 
   @throws(classOf[Exception])
   @BeforeEach
-  def before(): Unit = {
+  def setupEnv(): Unit = {
+testingTableEnv = TestingTableEnvironment
+  .create(settings, catalogManager = None, TableConfig.getDefault)
+tEnv = testingTableEnv
+
tEnv.getConfig.set(BatchExecutionOptions.ADAPTIVE_AUTO_PARALLELISM_ENABLED, 
Boolean.box(false))
+planner = 
tEnv.asInstanceOf[TableEnvironmentImpl].getPlanner.asInstanceOf[PlannerBase]
+env = planner.getExecEnv
+env.getConfig.enableObjectReuse()
+tableConfig = tEnv.getConfig
 BatchTestBase.configForMiniCluster(tableConfig)
   }
 
+  @throws(classOf[Exception])
+  @BeforeEach
+  def before(): Unit = {}
+
   @AfterEach
   def after(): Unit = {
 TestValuesTableFactory.clearAllData()
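The BatchTestBase change above illustrates a general refactoring: field initializers that build heavyweight state run at construction time, before any `@BeforeEach` hook fires (and, here, before a `MiniClusterExtension` can start the cluster), so the diff replaces them with `_` placeholders and assigns the fields in a setup method instead. A minimal plain-Java sketch of that deferred-initialization pattern (all class, field, and method names below are illustrative stand-ins, not Flink APIs):

```java
// Sketch: move eager field initialization into an explicit setup method,
// so surrounding lifecycle machinery (a test extension, a mini-cluster)
// can run first. Hypothetical names; this is not Flink code.
class LifecycleBase {
    // Before the refactoring, fields were initialized at construction time,
    // e.g. `protected Env env = Env.create();`, which runs before any hook.
    // After it, fields start out null and are assigned in setup().
    protected StringBuilder env;   // stands in for the execution environment
    protected String config;       // stands in for the table config

    /** Corresponds to the @BeforeEach setup method in the diff. */
    public void setup() {
        env = new StringBuilder("env");        // created only once setup runs
        config = "object-reuse=true";          // config applied after creation
    }

    /** Corresponds to the @AfterEach cleanup method. */
    public void teardown() {
        env = null;
        config = null;
    }

    public static void main(String[] args) {
        LifecycleBase base = new LifecycleBase();
        // Before setup() the fields are unset, mirroring `_` in the Scala diff:
        assert base.env == null;
        base.setup();
        assert base.env != null && base.config.contains("object-reuse");
        base.teardown();
        assert base.env == null;
        System.out.println("lifecycle ok");
    }
}
```

Subclasses that previously relied on the fields being ready at construction time (as `CompactManagedTableITCase` did with `tableIdentifier`) must likewise move their own initialization into a hook that runs after the base setup, which is exactly what the Java part of the diff does.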



(flink) 03/03: [hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git

commit e7ac9887f92c670d8584457c4b83948409ec67fe
Author: Chesnay Schepler 
AuthorDate: Mon Feb 12 11:50:39 2024 +0100

[hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase
---
 .../protobuf/{ProtobufSQLITCaseTest.java => ProtobufSQLITCase.java} | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCase.java
similarity index 99%
rename from flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
rename to flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCase.java
index ea869241ea2..422574115e8 100644
--- a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
+++ b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCase.java
@@ -40,7 +40,7 @@ import static org.junit.Assert.assertNull;
 import static org.junit.Assert.fail;
 
 /** Integration SQL test for protobuf. */
-public class ProtobufSQLITCaseTest extends BatchTestBase {
+public class ProtobufSQLITCase extends BatchTestBase {
 
 private MapTest getProtoTestObject() {
 MapTest.InnerMessageTest innerMessageTest =



(flink) 01/03: [FLINK-34422] Migrate BatchTestBase subclass to jUnit5

2024-02-12 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 994850d33a32f1ac27cee755f976b86208f911e3
Author: Chesnay Schepler 
AuthorDate: Sun Feb 11 16:06:59 2024 +0100

[FLINK-34422] Migrate BatchTestBase subclass to jUnit5

Usually this should've been done _before_ you ban jUnit4 annotations in BatchTestBase...
---
 .../java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
index 1cc1a200d3a..ea869241ea2 100644
--- a/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
+++ b/flink-formats/flink-protobuf/src/test/java/org/apache/flink/formats/protobuf/ProtobufSQLITCaseTest.java
@@ -27,7 +27,7 @@ import 
org.apache.flink.table.planner.runtime.utils.BatchTestBase;
 import org.apache.flink.types.Row;
 import org.apache.flink.util.CloseableIterator;
 
-import org.junit.Test;
+import org.junit.jupiter.api.Test;
 
 import java.util.Map;
 import java.util.concurrent.ExecutionException;



(flink) branch master updated (9caa3bbb042 -> 6f74889cbb5)

2024-02-12 Thread snuyanzin
This is an automated email from the ASF dual-hosted git repository.

snuyanzin pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 9caa3bbb042 [hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase
 add 6f74889cbb5 [FLINK-34157][table] Migrate FlinkLimit0RemoveRule to java

No new revisions were added by this update.

Summary of changes:
 .../plan/rules/logical/FlinkLimit0RemoveRule.java  | 70 ++
 .../plan/rules/logical/FlinkLimit0RemoveRule.scala | 46 --
 2 files changed, 70 insertions(+), 46 deletions(-)
 create mode 100644 flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/rules/logical/FlinkLimit0RemoveRule.java
 delete mode 100644 flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/FlinkLimit0RemoveRule.scala



(flink-connector-kafka) 01/02: [FLINK-34192] Update to be compatible with updated SinkV2 interfaces

2024-02-12 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit b8328ab55e2bcf026ef82e35cebbb1d867cfb18f
Author: Jiabao Sun 
AuthorDate: Thu Feb 8 23:16:44 2024 +0800

[FLINK-34192] Update to be compatible with updated SinkV2 interfaces
---
 .github/workflows/push_pr.yml  |   2 +
 flink-connector-kafka/pom.xml  |   4 +
 .../connector/kafka/sink/KafkaWriterITCase.java| 149 ++---
 .../kafka/table/KafkaTableTestUtils.java   |  16 ++-
 4 files changed, 91 insertions(+), 80 deletions(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index d57c0181..00e2f788 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -30,6 +30,8 @@ jobs:
 include:
   - flink: 1.18.1
 jdk: '8, 11, 17'
+  - flink: 1.19-SNAPSHOT
+jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 40d6a9f3..6510b9c8 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -144,6 +144,10 @@ under the License.
 org.slf4j
 slf4j-api
 
+
+io.dropwizard.metrics
+metrics-core
+
 
 test
 
diff --git a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
index 41c26633..c9eceb98 100644
--- a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
+++ b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java
@@ -27,9 +27,11 @@ import org.apache.flink.metrics.Counter;
 import org.apache.flink.metrics.Gauge;
 import org.apache.flink.metrics.MetricGroup;
 import org.apache.flink.metrics.groups.OperatorIOMetricGroup;
+import org.apache.flink.metrics.groups.OperatorMetricGroup;
 import org.apache.flink.metrics.groups.SinkWriterMetricGroup;
 import org.apache.flink.metrics.testutils.MetricListener;
 import org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup;
+import org.apache.flink.runtime.metrics.groups.ProxyMetricGroup;
 import org.apache.flink.runtime.metrics.groups.UnregisteredMetricGroups;
 import org.apache.flink.util.TestLoggerExtension;
 import org.apache.flink.util.UserCodeClassLoader;
@@ -58,7 +60,6 @@ import java.io.IOException;
 import java.nio.ByteBuffer;
 import java.util.ArrayList;
 import java.util.Collection;
-import java.util.Collections;
 import java.util.Comparator;
 import java.util.List;
 import java.util.Optional;
@@ -84,7 +85,7 @@ public class KafkaWriterITCase {
 private static final Network NETWORK = Network.newNetwork();
 private static final String KAFKA_METRIC_WITH_GROUP_NAME = 
"KafkaProducer.incoming-byte-total";
 private static final SinkWriter.Context SINK_WRITER_CONTEXT = new 
DummySinkWriterContext();
-private String topic;
+private static String topic;
 
 private MetricListener metricListener;
 private TriggerTimeService timeService;
@@ -130,11 +131,8 @@ public class KafkaWriterITCase {
 
 @Test
 public void testIncreasingRecordBasedCounters() throws Exception {
-final OperatorIOMetricGroup operatorIOMetricGroup =
-
UnregisteredMetricGroups.createUnregisteredOperatorMetricGroup().getIOMetricGroup();
-final InternalSinkWriterMetricGroup metricGroup =
-InternalSinkWriterMetricGroup.mock(
-metricListener.getMetricGroup(), 
operatorIOMetricGroup);
+final SinkWriterMetricGroup metricGroup = 
createSinkWriterMetricGroup();
+
 try (final KafkaWriter writer =
 createWriterWithConfiguration(
 getKafkaClientConfiguration(), DeliveryGuarantee.NONE, 
metricGroup)) {
@@ -167,13 +165,9 @@ public class KafkaWriterITCase {
 
 @Test
 public void testCurrentSendTimeMetric() throws Exception {
-final InternalSinkWriterMetricGroup metricGroup =
-
InternalSinkWriterMetricGroup.mock(metricListener.getMetricGroup());
 try (final KafkaWriter writer =
 createWriterWithConfiguration(
-getKafkaClientConfiguration(),
-DeliveryGuarantee.AT_LEAST_ONCE,
-metricGroup)) {
+getKafkaClientConfiguration(), 
DeliveryGuarantee.AT_LEAST_ONCE)) {
 final Optional> currentSendTime =
 metricListener.getGauge("currentSendTim
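The KafkaWriterITCase diff above repeatedly swaps inline `InternalSinkWriterMetricGroup.mock(...)` construction for a shared `createSinkWriterMetricGroup()` helper, so that a change to the SinkV2 metric-group interfaces only has to be absorbed in one place. A hedged plain-Java sketch of that fixture-factory pattern (the `MetricGroupFixture` type and its fields are hypothetical stand-ins, not Flink or Kafka APIs):

```java
// Sketch of the fixture-factory refactoring applied in the diff above.
// MetricGroupFixture and its fields are hypothetical; this is not Flink code.
class FixtureFactoryDemo {
    static final class MetricGroupFixture {
        final String listenerGroup;   // stands in for the metric listener's group
        final String ioGroup;         // stands in for the operator IO metric group
        MetricGroupFixture(String listenerGroup, String ioGroup) {
            this.listenerGroup = listenerGroup;
            this.ioGroup = ioGroup;
        }
    }

    // Before: every test built the fixture inline, repeating the same
    // construction details. After: a single factory owns the construction,
    // so an interface change (as in FLINK-34192) is absorbed in one place.
    static MetricGroupFixture createSinkWriterMetricGroup() {
        return new MetricGroupFixture("metric-listener-group", "operator-io-group");
    }

    public static void main(String[] args) {
        MetricGroupFixture g1 = createSinkWriterMetricGroup();
        MetricGroupFixture g2 = createSinkWriterMetricGroup();
        assert g1 != g2;                               // each call: a fresh fixture
        assert g1.ioGroup.equals("operator-io-group"); // fully initialized
        System.out.println("fixture ok");
    }
}
```

The same diff also shows the complementary simplification: call sites that only needed a default fixture drop the explicit parameter entirely and let the writer-creation helper build one internally.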

(flink-connector-kafka) 02/02: [FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka

2024-02-12 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git

commit 2606a8256d0e25da19ebce4f92cd426b5bf63f7c
Author: Jiabao Sun 
AuthorDate: Thu Feb 8 23:17:23 2024 +0800

[FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka
---
 flink-connector-kafka/pom.xml   |  7 +++
 .../kafka/testutils/YamlFileMetadataService.java| 17 -
 2 files changed, 15 insertions(+), 9 deletions(-)

diff --git a/flink-connector-kafka/pom.xml b/flink-connector-kafka/pom.xml
index 6510b9c8..529d9252 100644
--- a/flink-connector-kafka/pom.xml
+++ b/flink-connector-kafka/pom.xml
@@ -171,6 +171,13 @@ under the License.
 test
 
 
+
+org.yaml
+snakeyaml
+1.31
+test
+
+
 
 org.apache.flink
 flink-test-utils
diff --git a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
index 32839f37..524f7243 100644
--- a/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
+++ b/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/testutils/YamlFileMetadataService.java
@@ -23,20 +23,19 @@ import org.apache.flink.connector.kafka.dynamic.metadata.ClusterMetadata;
 import org.apache.flink.connector.kafka.dynamic.metadata.KafkaMetadataService;
 import org.apache.flink.connector.kafka.dynamic.metadata.KafkaStream;
 
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.DumperOptions;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.TypeDescription;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.Yaml;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.constructor.Constructor;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.Node;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.SequenceNode;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.nodes.Tag;
-import org.apache.flink.shaded.jackson2.org.yaml.snakeyaml.representer.Representer;
-
 import com.google.common.base.MoreObjects;
 import com.google.common.collect.ImmutableMap;
 import org.apache.kafka.clients.CommonClientConfigs;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
+import org.yaml.snakeyaml.DumperOptions;
+import org.yaml.snakeyaml.TypeDescription;
+import org.yaml.snakeyaml.Yaml;
+import org.yaml.snakeyaml.constructor.Constructor;
+import org.yaml.snakeyaml.nodes.Node;
+import org.yaml.snakeyaml.nodes.SequenceNode;
+import org.yaml.snakeyaml.nodes.Tag;
+import org.yaml.snakeyaml.representer.Representer;
 
 import java.io.File;
 import java.io.FileWriter;
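For context, the unshaded imports above map onto plain SnakeYAML 1.x usage. Below is a minimal, self-contained sketch of the dump/load round trip such a test utility performs; the `Broker` bean and `roundTrip` helper are hypothetical stand-ins for the connector's metadata classes, assuming SnakeYAML 1.x on the test classpath as declared in the pom change above:

```java
import org.yaml.snakeyaml.DumperOptions;
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.Constructor;
import org.yaml.snakeyaml.representer.Representer;

public class YamlRoundTrip {

    /** Hypothetical bean, standing in for the connector's metadata classes. */
    public static class Broker {
        public String host;
        public int port;
    }

    /** Dumps a Broker to YAML and loads it back, as a file-based metadata service might. */
    public static Broker roundTrip(String host, int port) {
        DumperOptions options = new DumperOptions();
        options.setDefaultFlowStyle(DumperOptions.FlowStyle.BLOCK);
        // SnakeYAML 1.x API: no-arg Representer and Constructor(Class) are still available.
        Yaml yaml = new Yaml(new Constructor(Broker.class), new Representer(), options);

        Broker broker = new Broker();
        broker.host = host;
        broker.port = port;
        String dumped = yaml.dump(broker);
        return yaml.load(dumped);
    }

    public static void main(String[] args) {
        Broker loaded = roundTrip("localhost", 9092);
        System.out.println(loaded.host + ":" + loaded.port);
    }
}
```

The typed `Constructor(Broker.class)` makes the loaded root a `Broker` rather than a generic `Map`, which is why the removed shaded imports and the new direct ones are a one-for-one swap.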



(flink-connector-kafka) branch main updated (cfb275b4 -> 2606a825)

2024-02-12 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


from cfb275b4 [FLINK-34244] Update Confluent Platform to 7.4.4. This closes #81
 new b8328ab5 [FLINK-34192] Update to be compatible with updated SinkV2 interfaces
 new 2606a825 [FLINK-34193] Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .github/workflows/push_pr.yml  |   2 +
 flink-connector-kafka/pom.xml  |  11 ++
 .../connector/kafka/sink/KafkaWriterITCase.java| 149 ++---
 .../kafka/testutils/YamlFileMetadataService.java   |  17 ++-
 .../kafka/table/KafkaTableTestUtils.java   |  16 ++-
 5 files changed, 106 insertions(+), 89 deletions(-)



(flink-connector-kafka) branch dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0 created (now 725727e2)

2024-02-12 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch dependabot/maven/flink-connector-kafka/org.yaml-snakeyaml-2.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-kafka.git


  at 725727e2 Bump org.yaml:snakeyaml from 1.31 to 2.0 in /flink-connector-kafka

No new revisions were added by this update.
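Note that the Dependabot bump above is not a drop-in upgrade: SnakeYAML 2.0 removed the no-arg `Representer` constructor and requires a `LoaderOptions` argument when building a `Constructor`. As an illustrative sketch (not taken from an actual follow-up commit), the construction site in a utility like `YamlFileMetadataService` would need a change along these lines:

```diff
-        Representer representer = new Representer();
-        Constructor constructor = new Constructor(StreamMetadata.class);
+        DumperOptions dumperOptions = new DumperOptions();
+        Representer representer = new Representer(dumperOptions);
+        Constructor constructor =
+                new Constructor(StreamMetadata.class, new LoaderOptions());
```

The `StreamMetadata` class name here is hypothetical; the point is that any call site using the 1.x constructors must be touched before the 2.0 dependency bump can merge.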