(flink) 01/02: [FLINK-33311] `surefire.module.config` args should be before entry point classname

2023-11-22 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 905d5998093127b145c9aa7d32a1667c4b45e850
Author: Sergey Nuyanzin 
AuthorDate: Fri Nov 3 17:57:49 2023 +0100

[FLINK-33311] `surefire.module.config` args should be before entry point classname
---
 .../apache/flink/runtime/testutils/TestJvmProcess.java| 15 ---
 1 file changed, 8 insertions(+), 7 deletions(-)

diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java
index 78cebd083f3..c174f66f411 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java
@@ -141,21 +141,22 @@ public abstract class TestJvmProcess {
 "-Xmx" + jvmMemoryInMb + "m",
 "-classpath",
 getCurrentClasspath(),
-"-XX:+IgnoreUnrecognizedVMOptions",
-getEntryPointClassName()
+"-XX:+IgnoreUnrecognizedVMOptions"
 };
 
+final String moduleConfig = System.getProperty("surefire.module.config");
+if (moduleConfig != null) {
+cmd = ArrayUtils.addAll(cmd, moduleConfig.trim().split("\\s+"));
+}
+
+cmd = ArrayUtils.add(cmd, getEntryPointClassName());
+
 String[] jvmArgs = getJvmArgs();
 
 if (jvmArgs != null && jvmArgs.length > 0) {
 cmd = ArrayUtils.addAll(cmd, jvmArgs);
 }
 
-final String moduleConfig = System.getProperty("surefire.module.config");
-if (moduleConfig != null) {
-cmd = ArrayUtils.addAll(cmd, moduleConfig.split(" "));
-}
-
 synchronized (createDestroyLock) {
 checkState(process == null, "process already started");
 



(flink) branch master updated (2378babf86c -> 7f2818bea1c)

2023-11-22 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 2378babf86c [FLINK-29452][test] Allow unit tests to be executed individually
 new 905d5998093 [FLINK-33311] `surefire.module.config` args should be before entry point classname
 new 7f2818bea1c [hotfix] Rename `TestJvmProcess#getJvmArgs` to `TestJvmProcess#getMainMethodArgs`

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../ClusterUncaughtExceptionHandlerITCase.java |  2 +-
 .../io/disk/FileChannelManagerImplTest.java|  2 +-
 .../flink/runtime/testutils/DispatcherProcess.java |  2 +-
 .../flink/runtime/testutils/TestJvmProcess.java| 23 +++---
 .../testutils/TestingClusterEntrypointProcess.java |  2 +-
 .../flink/runtime/util/BlockingShutdownTest.java   |  2 +-
 .../runtime/util/FlinkSecurityManagerITCase.java   |  2 +-
 .../runtime/util/JvmExitOnFatalErrorTest.java  |  2 +-
 8 files changed, 19 insertions(+), 18 deletions(-)



(flink) 02/02: [hotfix] Rename `TestJvmProcess#getJvmArgs` to `TestJvmProcess#getMainMethodArgs`

2023-11-22 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 7f2818bea1c50d9a092c5360a3d4de3a86c411b2
Author: Sergey Nuyanzin 
AuthorDate: Fri Nov 3 18:03:39 2023 +0100

[hotfix] Rename `TestJvmProcess#getJvmArgs` to `TestJvmProcess#getMainMethodArgs`
---
 .../entrypoint/ClusterUncaughtExceptionHandlerITCase.java  |  2 +-
 .../flink/runtime/io/disk/FileChannelManagerImplTest.java  |  2 +-
 .../org/apache/flink/runtime/testutils/DispatcherProcess.java  |  2 +-
 .../org/apache/flink/runtime/testutils/TestJvmProcess.java | 10 +-
 .../runtime/testutils/TestingClusterEntrypointProcess.java |  2 +-
 .../org/apache/flink/runtime/util/BlockingShutdownTest.java|  2 +-
 .../apache/flink/runtime/util/FlinkSecurityManagerITCase.java  |  2 +-
 .../org/apache/flink/runtime/util/JvmExitOnFatalErrorTest.java |  2 +-
 8 files changed, 12 insertions(+), 12 deletions(-)

diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/entrypoint/ClusterUncaughtExceptionHandlerITCase.java b/flink-runtime/src/test/java/org/apache/flink/runtime/entrypoint/ClusterUncaughtExceptionHandlerITCase.java
index 520c8d32591..ae5178310b2 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/entrypoint/ClusterUncaughtExceptionHandlerITCase.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/entrypoint/ClusterUncaughtExceptionHandlerITCase.java
@@ -130,7 +130,7 @@ public class ClusterUncaughtExceptionHandlerITCase extends TestLogger {
 }
 
 @Override
-public String[] getJvmArgs() {
+public String[] getMainMethodArgs() {
 return new String[0];
 }
 
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/io/disk/FileChannelManagerImplTest.java b/flink-runtime/src/test/java/org/apache/flink/runtime/io/disk/FileChannelManagerImplTest.java
index 17ab746d41a..73b8c565d16 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/io/disk/FileChannelManagerImplTest.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/io/disk/FileChannelManagerImplTest.java
@@ -185,7 +185,7 @@ class FileChannelManagerImplTest {
 }
 
 @Override
-public String[] getJvmArgs() {
+public String[] getMainMethodArgs() {
 return new String[] {Boolean.toString(callerHasHook), tmpDirectories, signalFilePath};
 }
 
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/DispatcherProcess.java b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/DispatcherProcess.java
index 66e7f710546..7f0a247dc21 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/DispatcherProcess.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/DispatcherProcess.java
@@ -78,7 +78,7 @@ public class DispatcherProcess extends TestJvmProcess {
 }
 
 @Override
-public String[] getJvmArgs() {
+public String[] getMainMethodArgs() {
 return jvmArgs;
 }
 
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java
index c174f66f411..e9ca964350e 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestJvmProcess.java
@@ -101,12 +101,12 @@ public abstract class TestJvmProcess {
  *
  * These can be parsed by the main method of the entry point class.
  */
-public abstract String[] getJvmArgs();
+public abstract String[] getMainMethodArgs();
 
 /**
  * Returns the name of the class to run.
  *
- * Arguments to the main method can be specified via {@link #getJvmArgs()}.
+ * Arguments to the main method can be specified via {@link #getMainMethodArgs()}.
  */
 public abstract String getEntryPointClassName();
 
@@ -151,10 +151,10 @@ public abstract class TestJvmProcess {
 
 cmd = ArrayUtils.add(cmd, getEntryPointClassName());
 
-String[] jvmArgs = getJvmArgs();
+String[] mainMethodArgs = getMainMethodArgs();
 
-if (jvmArgs != null && jvmArgs.length > 0) {
-cmd = ArrayUtils.addAll(cmd, jvmArgs);
+if (mainMethodArgs != null && mainMethodArgs.length > 0) {
+cmd = ArrayUtils.addAll(cmd, mainMethodArgs);
 }
 
 synchronized (createDestroyLock) {
diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestingClusterEntrypointProcess.java b/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestingClusterEntrypointProcess.java
index ef9f674a72d..81dd2043ac9 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/testutils/TestingClusterEntrypointProcess.java
+++ b/flink-runtime/src/test/java/org/apa

(flink) branch master updated: [hotfix][python][docs] Fix broken syntax in Flink Table API query example

2023-11-22 Thread dianfu
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 72798a76228 [hotfix][python][docs] Fix broken syntax in Flink Table API query example
72798a76228 is described below

commit 72798a7622830d3052cb6945239dd03e19a754af
Author: Deepyaman Datta 
AuthorDate: Fri Jul 21 13:22:54 2023 -0500

[hotfix][python][docs] Fix broken syntax in Flink Table API query example

This closes #23043.
---
 docs/content.zh/docs/dev/python/table/intro_to_table_api.md | 4 ++--
 docs/content/docs/dev/python/table/intro_to_table_api.md| 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/content.zh/docs/dev/python/table/intro_to_table_api.md b/docs/content.zh/docs/dev/python/table/intro_to_table_api.md
index dd08f67402b..98a689d0a0a 100644
--- a/docs/content.zh/docs/dev/python/table/intro_to_table_api.md
+++ b/docs/content.zh/docs/dev/python/table/intro_to_table_api.md
@@ -322,7 +322,7 @@ new_table.execute().print()
 
 ```python
 from pyflink.table import EnvironmentSettings, TableEnvironment
-from pyflink.table.expressions import col
+from pyflink.table.expressions import call, col
 
 # 通过 batch table environment 来执行查询
 env_settings = EnvironmentSettings.in_batch_mode()
@@ -336,7 +336,7 @@ revenue = orders \
 .select(col("name"), col("country"), col("revenue")) \
 .where(col("country") == 'FRANCE') \
 .group_by(col("name")) \
-.select(col("name"), col("country").sum.alias('rev_sum'))
+.select(col("name"), call("sum", col("revenue")).alias('rev_sum'))
 
 revenue.execute().print()
 ```
diff --git a/docs/content/docs/dev/python/table/intro_to_table_api.md b/docs/content/docs/dev/python/table/intro_to_table_api.md
index ce540a68610..43e657514f1 100644
--- a/docs/content/docs/dev/python/table/intro_to_table_api.md
+++ b/docs/content/docs/dev/python/table/intro_to_table_api.md
@@ -323,7 +323,7 @@ The following example shows a simple Table API aggregation query:
 
 ```python
 from pyflink.table import EnvironmentSettings, TableEnvironment
-from pyflink.table.expressions import col
+from pyflink.table.expressions import call, col
 
 # using batch table environment to execute the queries
 env_settings = EnvironmentSettings.in_batch_mode()
@@ -337,7 +337,7 @@ revenue = orders \
 .select(col("name"), col("country"), col("revenue")) \
 .where(col("country") == 'FRANCE') \
 .group_by(col("name")) \
-.select(col("name"), col("country").sum.alias('rev_sum'))
+.select(col("name"), call("sum", col("revenue")).alias('rev_sum'))
 
 revenue.execute().print()
 ```
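For reference, the corrected query assembled into a self-contained script; the `orders` rows below are made up for illustration, while the query itself matches the fixed documentation example:

```python
from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import call, col

# Batch environment plus a small illustrative orders table.
env_settings = EnvironmentSettings.in_batch_mode()
table_env = TableEnvironment.create(env_settings)
orders = table_env.from_elements(
    [("Jack", "FRANCE", 10), ("Rose", "ENGLAND", 30), ("Jack", "FRANCE", 20)],
    ["name", "country", "revenue"])

# The corrected aggregation: sum the revenue column via the built-in "sum"
# function instead of the broken col("country").sum form.
revenue = orders \
    .select(col("name"), col("country"), col("revenue")) \
    .where(col("country") == 'FRANCE') \
    .group_by(col("name")) \
    .select(col("name"), call("sum", col("revenue")).alias('rev_sum'))

revenue.execute().print()
```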



(flink) branch release-1.18 updated: [hotfix][python][docs] Fix broken syntax in Flink Table API query example

2023-11-22 Thread dianfu
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch release-1.18
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.18 by this push:
 new dbbe4c11f14 [hotfix][python][docs] Fix broken syntax in Flink Table API query example
dbbe4c11f14 is described below

commit dbbe4c11f14a9341afa6dd4470feae46b291ddcc
Author: Deepyaman Datta 
AuthorDate: Fri Jul 21 13:22:54 2023 -0500

[hotfix][python][docs] Fix broken syntax in Flink Table API query example

This closes #23043.
---
 docs/content.zh/docs/dev/python/table/intro_to_table_api.md | 4 ++--
 docs/content/docs/dev/python/table/intro_to_table_api.md| 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/content.zh/docs/dev/python/table/intro_to_table_api.md b/docs/content.zh/docs/dev/python/table/intro_to_table_api.md
index dd08f67402b..98a689d0a0a 100644
--- a/docs/content.zh/docs/dev/python/table/intro_to_table_api.md
+++ b/docs/content.zh/docs/dev/python/table/intro_to_table_api.md
@@ -322,7 +322,7 @@ new_table.execute().print()
 
 ```python
 from pyflink.table import EnvironmentSettings, TableEnvironment
-from pyflink.table.expressions import col
+from pyflink.table.expressions import call, col
 
 # 通过 batch table environment 来执行查询
 env_settings = EnvironmentSettings.in_batch_mode()
@@ -336,7 +336,7 @@ revenue = orders \
 .select(col("name"), col("country"), col("revenue")) \
 .where(col("country") == 'FRANCE') \
 .group_by(col("name")) \
-.select(col("name"), col("country").sum.alias('rev_sum'))
+.select(col("name"), call("sum", col("revenue")).alias('rev_sum'))
 
 revenue.execute().print()
 ```
diff --git a/docs/content/docs/dev/python/table/intro_to_table_api.md b/docs/content/docs/dev/python/table/intro_to_table_api.md
index ce540a68610..43e657514f1 100644
--- a/docs/content/docs/dev/python/table/intro_to_table_api.md
+++ b/docs/content/docs/dev/python/table/intro_to_table_api.md
@@ -323,7 +323,7 @@ The following example shows a simple Table API aggregation query:
 
 ```python
 from pyflink.table import EnvironmentSettings, TableEnvironment
-from pyflink.table.expressions import col
+from pyflink.table.expressions import call, col
 
 # using batch table environment to execute the queries
 env_settings = EnvironmentSettings.in_batch_mode()
@@ -337,7 +337,7 @@ revenue = orders \
 .select(col("name"), col("country"), col("revenue")) \
 .where(col("country") == 'FRANCE') \
 .group_by(col("name")) \
-.select(col("name"), col("country").sum.alias('rev_sum'))
+.select(col("name"), call("sum", col("revenue")).alias('rev_sum'))
 
 revenue.execute().print()
 ```



(flink) branch master updated (72798a76228 -> f2260a8702e)

2023-11-22 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 72798a76228 [hotfix][python][docs] Fix broken syntax in Flink Table API query example
 new 262c967fc07 [FLINK-33503][build] Upgrading Maven wrapper from 3.1.0 to 3.2.0
 new f2260a8702e [FLINK-33503][build] Adds checksum and comment for Maven wrapper download

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .mvn/wrapper/maven-wrapper.properties |   9 +-
 mvnw  | 218 +--
 mvnw.cmd  | 393 ++
 3 files changed, 316 insertions(+), 304 deletions(-)



(flink) 01/02: [FLINK-33503][build] Upgrading Maven wrapper from 3.1.0 to 3.2.0

2023-11-22 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 262c967fc0739760e0993330120e412a1715847a
Author: Matthias Pohl 
AuthorDate: Tue Nov 21 12:02:21 2023 +0100

[FLINK-33503][build] Upgrading Maven wrapper from 3.1.0 to 3.2.0

This update was performed by calling ./mvnw wrapper:wrapper
---
 .mvn/wrapper/maven-wrapper.properties |   6 +-
 mvnw  | 218 +--
 mvnw.cmd  | 393 ++
 3 files changed, 313 insertions(+), 304 deletions(-)

diff --git a/.mvn/wrapper/maven-wrapper.properties b/.mvn/wrapper/maven-wrapper.properties
index 57bb584385e..6f40a26edcf 100644
--- a/.mvn/wrapper/maven-wrapper.properties
+++ b/.mvn/wrapper/maven-wrapper.properties
@@ -5,9 +5,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -15,4 +15,4 @@
 # specific language governing permissions and limitations
 # under the License.
 
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.6/apache-maven-3.8.6-bin.zip
-wrapperUrl=https://repo.maven.apache.org/maven2/org/apache/maven/wrapper/maven-wrapper/3.1.0/maven-wrapper-3.1.0.jar
+wrapperUrl=https://repo.maven.apache.org/maven2/org/apache/maven/wrapper/maven-wrapper/3.2.0/maven-wrapper-3.2.0.jar
diff --git a/mvnw b/mvnw
index 5643201c7d8..8d937f4c14f 100755
--- a/mvnw
+++ b/mvnw
@@ -19,7 +19,7 @@
 # 
 
 # 
-# Maven Start Up Batch script
+# Apache Maven Wrapper startup batch script, version 3.2.0
 #
 # Required ENV vars:
 # --
@@ -27,7 +27,6 @@
 #
 # Optional ENV vars
 # -
-#   M2_HOME - location of maven2's installed home dir
 #   MAVEN_OPTS - parameters passed to the Java VM when running Maven
 # e.g. to debug Maven itself, use
#   set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
@@ -54,7 +53,7 @@ fi
 cygwin=false;
 darwin=false;
 mingw=false
-case "`uname`" in
+case "$(uname)" in
   CYGWIN*) cygwin=true ;;
   MINGW*) mingw=true;;
   Darwin*) darwin=true
@@ -62,9 +61,9 @@ case "`uname`" in
 # See https://developer.apple.com/library/mac/qa/qa1170/_index.html
 if [ -z "$JAVA_HOME" ]; then
   if [ -x "/usr/libexec/java_home" ]; then
-export JAVA_HOME="`/usr/libexec/java_home`"
+JAVA_HOME="$(/usr/libexec/java_home)"; export JAVA_HOME
   else
-export JAVA_HOME="/Library/Java/Home"
+JAVA_HOME="/Library/Java/Home"; export JAVA_HOME
   fi
 fi
 ;;
@@ -72,68 +71,38 @@ esac
 
 if [ -z "$JAVA_HOME" ] ; then
   if [ -r /etc/gentoo-release ] ; then
-JAVA_HOME=`java-config --jre-home`
+JAVA_HOME=$(java-config --jre-home)
   fi
 fi
 
-if [ -z "$M2_HOME" ] ; then
-  ## resolve links - $0 may be a link to maven's home
-  PRG="$0"
-
-  # need this for relative symlinks
-  while [ -h "$PRG" ] ; do
-ls=`ls -ld "$PRG"`
-link=`expr "$ls" : '.*-> \(.*\)$'`
-if expr "$link" : '/.*' > /dev/null; then
-  PRG="$link"
-else
-  PRG="`dirname "$PRG"`/$link"
-fi
-  done
-
-  saveddir=`pwd`
-
-  M2_HOME=`dirname "$PRG"`/..
-
-  # make it fully qualified
-  M2_HOME=`cd "$M2_HOME" && pwd`
-
-  cd "$saveddir"
-  # echo Using m2 at $M2_HOME
-fi
-
 # For Cygwin, ensure paths are in UNIX format before anything is touched
 if $cygwin ; then
-  [ -n "$M2_HOME" ] &&
-M2_HOME=`cygpath --unix "$M2_HOME"`
   [ -n "$JAVA_HOME" ] &&
-JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
+JAVA_HOME=$(cygpath --unix "$JAVA_HOME")
   [ -n "$CLASSPATH" ] &&
-CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
+CLASSPATH=$(cygpath --path --unix "$CLASSPATH")
 fi
 
 # For Mingw, ensure paths are in UNIX format before anything is touched
 if $mingw ; then
-  [ -n "$M2_HOME" ] &&
-M2_HOME="`(cd "$M2_HOME"; pwd)`"
-  [ -n "$JAVA_HOME" ] &&
-JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
+  [ -n "$JAVA_HOME" ] && [ -d "$JAVA_HOME" ] &&
+JAVA_HOME="$(cd "$JAVA_HOME" || (echo "cannot cd into $JAVA_HOME."; exit 
1); pwd)"
 fi
 
 if [ -z "$JAVA_HOME" ]; then
-  javaExecutable="`which javac`"
-  if [ -n "$javaExecutable" ] && ! [ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then
+  javaExecutable="$(which javac)"
+  if [ -n "$javaExecutable" ] && ! [ "$(expr "\"$javaExecutable\"" : '\([^ ]*\)')" = "no" ]; then
 # readlink(1) is not available as standard on Solaris 10.
-  

(flink) 02/02: [FLINK-33503][build] Adds checksum and comment for Maven wrapper download

2023-11-22 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit f2260a8702e9898ed8312f12de5b53483ccd5780
Author: Matthias Pohl 
AuthorDate: Tue Nov 21 12:20:10 2023 +0100

[FLINK-33503][build] Adds checksum and comment for Maven wrapper download
---
 .mvn/wrapper/maven-wrapper.properties | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/.mvn/wrapper/maven-wrapper.properties b/.mvn/wrapper/maven-wrapper.properties
index 6f40a26edcf..965f39f204b 100644
--- a/.mvn/wrapper/maven-wrapper.properties
+++ b/.mvn/wrapper/maven-wrapper.properties
@@ -15,4 +15,7 @@
 # specific language governing permissions and limitations
 # under the License.
 
 distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.6/apache-maven-3.8.6-bin.zip
+distributionSha256Sum=ccf20a80e75a17ffc34d47c5c95c98c39d426ca17d670f09cd91e877072a9309
 wrapperUrl=https://repo.maven.apache.org/maven2/org/apache/maven/wrapper/maven-wrapper/3.2.0/maven-wrapper-3.2.0.jar
+# TODO FLINK-33607: checksum verification doesn't seem to work under windows
+# wrapperSha256Sum=e63a53cfb9c4d291ebe3c2b0edacb7622bbc480326beaa5a0456e412f52f066a
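The new distributionSha256Sum property pins the expected digest of the Maven distribution the wrapper downloads. Conceptually the verification amounts to the sketch below (not the wrapper's actual implementation; the local file name is assumed):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Stream the file so large archives need not fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

EXPECTED = "ccf20a80e75a17ffc34d47c5c95c98c39d426ca17d670f09cd91e877072a9309"
# if sha256_of("apache-maven-3.8.6-bin.zip") != EXPECTED:
#     raise RuntimeError("checksum mismatch; refusing to use the download")
```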



(flink) branch master updated: [FLINK-33225][python] Parse `JVM_ARGS` as an array

2023-11-22 Thread dianfu
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new b1bfd70ad8a [FLINK-33225][python] Parse `JVM_ARGS` as an array
b1bfd70ad8a is described below

commit b1bfd70ad8a9e4e1a710dc5775837ba7102d4b70
Author: Deepyaman Datta 
AuthorDate: Mon Oct 9 11:17:06 2023 -0600

[FLINK-33225][python] Parse `JVM_ARGS` as an array

This closes #23500.
---
 flink-python/pyflink/pyflink_gateway_server.py | 33 ++
 1 file changed, 23 insertions(+), 10 deletions(-)

diff --git a/flink-python/pyflink/pyflink_gateway_server.py b/flink-python/pyflink/pyflink_gateway_server.py
index 07bba321bde..38cc557166c 100644
--- a/flink-python/pyflink/pyflink_gateway_server.py
+++ b/flink-python/pyflink/pyflink_gateway_server.py
@@ -163,9 +163,9 @@ def get_jvm_opts(env):
 read_from_config(KEY_ENV_JAVA_OPTS_DEPRECATED, "", flink_conf_file),
 flink_conf_file))
 
-# Remove leading and ending double quotes (if present) of value
-jvm_opts = jvm_opts.strip("\"")
-return jvm_opts.split(" ")
+# Remove leading and trailing double quotes (if present) of value
+jvm_opts = jvm_opts.strip('"')
+return jvm_opts.split()
 
 
 def construct_flink_classpath(env):
@@ -248,19 +248,32 @@ def launch_gateway_server_process(env, args):
 if program_args.cluster_type == "local":
 java_executable = find_java_executable()
 log_settings = construct_log_settings(env)
-jvm_args = env.get('JVM_ARGS', '')
+jvm_args = env.get('JVM_ARGS', '').split()
 jvm_opts = get_jvm_opts(env)
 classpath = os.pathsep.join(
 [construct_flink_classpath(env), construct_hadoop_classpath(env)])
 if "FLINK_TESTING" in env:
 classpath = os.pathsep.join([classpath, construct_test_classpath()])
-command = [java_executable, jvm_args, "-XX:+IgnoreUnrecognizedVMOptions",
-   "--add-opens=jdk.proxy2/jdk.proxy2=ALL-UNNAMED"] \
-+ jvm_opts + log_settings \
-+ ["-cp", classpath, program_args.main_class] + program_args.other_args
+command = [
+java_executable,
+*jvm_args,
+"-XX:+IgnoreUnrecognizedVMOptions",
+"--add-opens=jdk.proxy2/jdk.proxy2=ALL-UNNAMED",
+*jvm_opts,
+*log_settings,
+"-cp",
+classpath,
+program_args.main_class,
+*program_args.other_args,
+]
 else:
-command = [os.path.join(env["FLINK_BIN_DIR"], "flink"), "run"] + program_args.other_args \
-+ ["-c", program_args.main_class]
+command = [
+os.path.join(env["FLINK_BIN_DIR"], "flink"),
+"run",
+*program_args.other_args,
+"-c",
+program_args.main_class,
+]
 preexec_fn = None
 if not on_windows():
 def preexec_func():
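The split matters because each element of the command list becomes exactly one argv entry for the child process; before the fix a JVM_ARGS value holding several options reached the JVM as a single malformed argument. A small illustration with hypothetical values:

```python
# Hypothetical JVM_ARGS value; before the fix the whole string was passed as
# one argv element, which the JVM cannot parse as an option.
jvm_args_raw = "-Xms512m -Xmx1024m"

broken = ["java", jvm_args_raw, "-version"]          # argv[1] == "-Xms512m -Xmx1024m"
fixed = ["java", *jvm_args_raw.split(), "-version"]  # two separate options

print(broken)
print(fixed)
# Launching the fixed list (e.g. via subprocess.Popen) starts the JVM normally,
# while the broken form typically fails with an option-parsing error.
```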



(flink) branch release-1.18 updated: [FLINK-33225][python] Parse `JVM_ARGS` as an array

2023-11-22 Thread dianfu
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch release-1.18
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.18 by this push:
 new 73273ae1669 [FLINK-33225][python] Parse `JVM_ARGS` as an array
73273ae1669 is described below

commit 73273ae1669c9f01b61667dedb85a7e745d6bbe2
Author: Deepyaman Datta 
AuthorDate: Mon Oct 9 11:17:06 2023 -0600

[FLINK-33225][python] Parse `JVM_ARGS` as an array

This closes #23500.
---
 flink-python/pyflink/pyflink_gateway_server.py | 33 ++
 1 file changed, 23 insertions(+), 10 deletions(-)

diff --git a/flink-python/pyflink/pyflink_gateway_server.py b/flink-python/pyflink/pyflink_gateway_server.py
index 07bba321bde..38cc557166c 100644
--- a/flink-python/pyflink/pyflink_gateway_server.py
+++ b/flink-python/pyflink/pyflink_gateway_server.py
@@ -163,9 +163,9 @@ def get_jvm_opts(env):
 read_from_config(KEY_ENV_JAVA_OPTS_DEPRECATED, "", flink_conf_file),
 flink_conf_file))
 
-# Remove leading and ending double quotes (if present) of value
-jvm_opts = jvm_opts.strip("\"")
-return jvm_opts.split(" ")
+# Remove leading and trailing double quotes (if present) of value
+jvm_opts = jvm_opts.strip('"')
+return jvm_opts.split()
 
 
 def construct_flink_classpath(env):
@@ -248,19 +248,32 @@ def launch_gateway_server_process(env, args):
 if program_args.cluster_type == "local":
 java_executable = find_java_executable()
 log_settings = construct_log_settings(env)
-jvm_args = env.get('JVM_ARGS', '')
+jvm_args = env.get('JVM_ARGS', '').split()
 jvm_opts = get_jvm_opts(env)
 classpath = os.pathsep.join(
 [construct_flink_classpath(env), construct_hadoop_classpath(env)])
 if "FLINK_TESTING" in env:
 classpath = os.pathsep.join([classpath, construct_test_classpath()])
-command = [java_executable, jvm_args, "-XX:+IgnoreUnrecognizedVMOptions",
-   "--add-opens=jdk.proxy2/jdk.proxy2=ALL-UNNAMED"] \
-+ jvm_opts + log_settings \
-+ ["-cp", classpath, program_args.main_class] + program_args.other_args
+command = [
+java_executable,
+*jvm_args,
+"-XX:+IgnoreUnrecognizedVMOptions",
+"--add-opens=jdk.proxy2/jdk.proxy2=ALL-UNNAMED",
+*jvm_opts,
+*log_settings,
+"-cp",
+classpath,
+program_args.main_class,
+*program_args.other_args,
+]
 else:
-command = [os.path.join(env["FLINK_BIN_DIR"], "flink"), "run"] + program_args.other_args \
-+ ["-c", program_args.main_class]
+command = [
+os.path.join(env["FLINK_BIN_DIR"], "flink"),
+"run",
+*program_args.other_args,
+"-c",
+program_args.main_class,
+]
 preexec_fn = None
 if not on_windows():
 def preexec_func():



(flink) branch master updated (b1bfd70ad8a -> c61c09e4640)

2023-11-22 Thread snuyanzin
This is an automated email from the ASF dual-hosted git repository.

snuyanzin pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from b1bfd70ad8a [FLINK-33225][python] Parse `JVM_ARGS` as an array
 new d92423069b0 [FLINK-33591][table] Cleanup usage of deprecated TableTestBase#addFunction
 new c61c09e4640 [FLINK-31597][table] Cleanup usage of deprecated TableEnvironment#registerFunction

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 docs/themes/book   |   2 +-
 .../PythonScalarFunctionOperatorTestBase.java  |   2 +-
 .../planner/plan/common/CalcMergeTestBase.java |   3 +-
 .../PushFilterIntoTableSourceScanRuleTestBase.java |   2 +-
 .../plan/utils/JavaUserDefinedAggFunctions.java|   5 +-
 .../runtime/utils/JavaUserDefinedAggFunctions.java |   5 +-
 .../utils/JavaUserDefinedScalarFunctions.java  |   2 +-
 .../planner/plan/batch/sql/DagOptimizationTest.xml |  12 +--
 .../plan/batch/sql/LegacyTableSourceTest.xml   |   2 +-
 .../planner/plan/batch/sql/RemoveCollationTest.xml |  26 ++---
 .../planner/plan/batch/sql/RemoveShuffleTest.xml   |  14 +--
 .../planner/plan/batch/sql/SubplanReuseTest.xml|  24 ++---
 .../sql/join/BroadcastHashSemiAntiJoinTest.xml |  20 ++--
 .../batch/sql/join/NestedLoopSemiAntiJoinTest.xml  |  20 ++--
 .../plan/batch/sql/join/SemiAntiJoinTest.xml   |  20 ++--
 .../sql/join/ShuffledHashSemiAntiJoinTest.xml  |  20 ++--
 .../batch/sql/join/SortMergeSemiAntiJoinTest.xml   |  20 ++--
 .../table/planner/plan/batch/table/CalcTest.xml|   2 +-
 .../CalcPythonCorrelateTransposeRuleTest.xml   |   8 +-
 .../rules/logical/ExpressionReductionRulesTest.xml |   2 +-
 .../ProjectSemiAntiJoinTransposeRuleTest.xml   |   4 +-
 .../rules/logical/PythonCorrelateSplitRuleTest.xml |  24 ++---
 .../SplitPythonConditionFromCorrelateRuleTest.xml  |   4 +-
 .../logical/subquery/SubQueryAntiJoinTest.xml  |  16 +--
 .../logical/subquery/SubQuerySemiJoinTest.xml  |  40 +++
 .../plan/stream/sql/DagOptimizationTest.xml|  16 +--
 .../plan/stream/sql/LegacyTableSourceTest.xml  |   2 +-
 .../stream/sql/RelTimeIndicatorConverterTest.xml   |   6 +-
 .../planner/plan/stream/sql/SubplanReuseTest.xml   |  26 ++---
 .../plan/stream/sql/join/SemiAntiJoinTest.xml  |  20 ++--
 .../plan/stream/table/ColumnFunctionsTest.xml  |   6 +-
 .../table/validation/AggregateValidationTest.xml}  |  14 +--
 .../sql/validation/OverWindowValidationTest.scala  |   2 +-
 .../table/planner/catalog/CatalogTableITCase.scala |   2 +-
 .../codegen/WatermarkGeneratorCodeGenTest.scala|   2 +-
 .../expressions/utils/ExpressionTestBase.scala |   2 +-
 .../utils/userDefinedScalarFunctions.scala |   6 +-
 .../utils/TestCollectionTableFactory.scala |   1 -
 .../table/planner/plan/batch/sql/CalcTest.scala|   2 +-
 .../plan/batch/sql/DagOptimizationTest.scala   |   6 +-
 .../plan/batch/sql/LegacyTableSourceTest.scala |   4 +-
 .../plan/batch/sql/PartitionableSourceTest.scala   |   2 +-
 .../plan/batch/sql/RemoveCollationTest.scala   |   9 +-
 .../planner/plan/batch/sql/RemoveShuffleTest.scala |   7 +-
 .../planner/plan/batch/sql/SubplanReuseTest.scala  |  10 +-
 .../planner/plan/batch/sql/TableScanTest.scala |   2 +-
 .../plan/batch/sql/agg/AggregateTestBase.scala |   4 +-
 .../plan/batch/sql/agg/GroupWindowTest.scala   |  10 +-
 .../plan/batch/sql/agg/OverAggregateTest.scala |   4 +-
 .../plan/batch/sql/join/LookupJoinTest.scala   |   4 +-
 .../plan/batch/sql/join/SemiAntiJoinTestBase.scala |   2 +-
 .../plan/batch/sql/join/SingleRowJoinTest.scala|   1 -
 .../table/planner/plan/batch/table/CalcTest.scala  |   2 +-
 .../planner/plan/batch/table/PythonCalcTest.scala  |   2 +-
 .../table/validation/AggregateValidationTest.scala |   4 +-
 .../CalcPythonCorrelateTransposeRuleTest.scala |   4 +-
 .../logical/ExpressionReductionRulesTest.scala |  10 +-
 .../rules/logical/FlinkCalcMergeRuleTest.scala |   2 +-
 ...lCorrelateToJoinFromTemporalTableRuleTest.scala |   4 +-
 .../ProjectSemiAntiJoinTransposeRuleTest.scala |   2 +-
 ...ojectWindowTableFunctionTransposeRuleTest.scala |   2 +-
 ...artitionIntoLegacyTableSourceScanRuleTest.scala |   4 +-
 ...hProjectIntoLegacyTableSourceScanRuleTest.scala |   2 +-
 .../rules/logical/PythonCalcSplitRuleTest.scala|   2 +-
 .../logical/PythonCorrelateSplitRuleTest.scala |   6 +-
 ...SplitPythonConditionFromCorrelateRuleTest.scala |   4 +-
 .../SplitPythonConditionFromJoinRuleTest.scala |   2 +-
 .../logical/subquery/SubQueryAntiJoinTest.scala|   2 +-
 .../logical/subquery/SubQuerySemiJoinTest.scala|   2 +-
 .../batch/EnforceLocalSortAggRuleTest.scala|   3

(flink) 02/02: [FLINK-31597][table] Cleanup usage of deprecated TableEnvironment#registerFunction

2023-11-22 Thread snuyanzin
This is an automated email from the ASF dual-hosted git repository.

snuyanzin pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit c61c09e464073fae430cab2dd56bd608f9d275fd
Author: Sergey Nuyanzin 
AuthorDate: Sun Nov 19 20:18:48 2023 +0100

[FLINK-31597][table] Cleanup usage of deprecated TableEnvironment#registerFunction

This closes #23751
---
 docs/themes/book   |   2 +-
 .../PythonScalarFunctionOperatorTestBase.java  |   2 +-
 .../table/planner/plan/batch/table/CalcTest.xml|   2 +-
 .../rules/logical/ExpressionReductionRulesTest.xml |   2 +-
 .../table/planner/catalog/CatalogTableITCase.scala |   2 +-
 .../codegen/WatermarkGeneratorCodeGenTest.scala|   2 +-
 .../expressions/utils/ExpressionTestBase.scala |   2 +-
 .../utils/userDefinedScalarFunctions.scala |   6 +-
 .../utils/TestCollectionTableFactory.scala |   1 -
 .../plan/batch/sql/DagOptimizationTest.scala   |   2 +-
 .../plan/batch/sql/RemoveCollationTest.scala   |   1 -
 .../planner/plan/batch/sql/RemoveShuffleTest.scala |   1 -
 .../planner/plan/batch/sql/SubplanReuseTest.scala  |   6 +-
 .../planner/plan/batch/sql/TableScanTest.scala |   2 +-
 .../plan/batch/sql/join/LookupJoinTest.scala   |   2 +-
 .../plan/batch/sql/join/SingleRowJoinTest.scala|   1 -
 .../table/planner/plan/batch/table/CalcTest.scala  |   2 +-
 ...lCorrelateToJoinFromTemporalTableRuleTest.scala |   4 +-
 ...ojectWindowTableFunctionTransposeRuleTest.scala |   2 +-
 ...hProjectIntoLegacyTableSourceScanRuleTest.scala |   2 +-
 .../rules/logical/PythonCalcSplitRuleTest.scala|   2 +-
 .../batch/EnforceLocalSortAggRuleTest.scala|   1 -
 .../plan/stream/sql/DagOptimizationTest.scala  |   2 +-
 .../plan/stream/sql/FilterableSourceTest.scala |   2 +-
 .../plan/stream/sql/SourceWatermarkTest.scala  |   2 +-
 .../planner/plan/stream/sql/SubplanReuseTest.scala |   6 +-
 .../planner/plan/stream/sql/TableScanTest.scala|   6 +-
 .../table/validation/CalcValidationTest.scala  |   2 +-
 .../validation/GroupWindowValidationTest.scala |   1 -
 .../plan/utils/ColumnIntervalUtilTest.scala|   2 +-
 .../table/planner/plan/utils/RexNodeTestBase.scala |   2 +-
 .../table/planner/plan/utils/lookupFunctions.scala |   1 -
 .../planner/runtime/batch/sql/CalcITCase.scala |  58 +-
 .../runtime/batch/sql/CorrelateITCase.scala| 120 -
 .../runtime/batch/sql/CorrelateITCase2.scala   |  41 ---
 .../planner/runtime/batch/sql/DecimalITCase.scala  |   2 -
 .../runtime/batch/sql/OverAggregateITCase.scala|  21 ++--
 .../batch/sql/PartitionableSinkITCase.scala|   2 +-
 .../sql/agg/AggregateReduceGroupingITCase.scala|   1 -
 .../sql/agg/DistinctAggregateITCaseBase.scala  |   2 -
 .../runtime/batch/sql/agg/GroupWindowITCase.scala  |   7 +-
 .../runtime/batch/sql/agg/GroupingSetsITCase.scala |   2 -
 .../batch/sql/agg/PruneAggregateCallITCase.scala   |   1 -
 .../runtime/batch/sql/agg/SortAggITCase.scala  | 110 +--
 .../sql/agg/SortDistinctAggregateITCase.scala  |   7 +-
 .../runtime/batch/sql/join/JoinITCase.scala|   5 +-
 .../planner/runtime/batch/table/CalcITCase.scala   |   8 +-
 .../runtime/batch/table/CorrelateITCase.scala  |   8 +-
 .../runtime/batch/table/DecimalITCase.scala|   1 -
 .../runtime/batch/table/GroupWindowITCase.scala|   2 +-
 ...WindowAggregateUseDaylightTimeHarnessTest.scala |   3 +-
 .../runtime/stream/FsStreamingSinkITCaseBase.scala |   1 -
 .../runtime/stream/sql/AggregateITCase.scala   |   4 +-
 .../runtime/stream/sql/AsyncLookupJoinITCase.scala |   8 +-
 .../runtime/stream/sql/ChangelogSourceITCase.scala |   1 -
 .../runtime/stream/sql/CorrelateITCase.scala   |  20 ++--
 .../stream/sql/FsStreamingSinkTestCsvITCase.scala  |   2 -
 .../planner/runtime/stream/sql/JoinITCase.scala|   2 +-
 .../runtime/stream/sql/LookupJoinITCase.scala  |   4 +-
 .../runtime/stream/sql/MatchRecognizeITCase.scala  |   2 +-
 .../runtime/stream/sql/OverAggregateITCase.scala   |   2 +-
 .../stream/sql/PruneAggregateCallITCase.scala  |   1 -
 .../planner/runtime/stream/sql/RankITCase.scala|   2 -
 .../runtime/stream/sql/TemporalSortITCase.scala|   2 -
 .../sql/TemporalTableFunctionJoinITCase.scala  |   2 +-
 .../stream/sql/WindowDistinctAggregateITCase.scala |   1 -
 .../planner/runtime/stream/table/CalcITCase.scala  |  10 +-
 .../runtime/stream/table/CorrelateITCase.scala |  10 +-
 .../planner/runtime/utils/BatchTableEnvUtil.scala  |   1 -
 .../planner/runtime/utils/BatchTestBase.scala  |  23 +---
 .../utils/UserDefinedFunctionTestUtils.scala   |  77 ++---
 .../flink/table/planner/utils/AvgAggFunction.scala |  28 -
 .../table/planner/utils/CountAggFunction.scala |  25 +++--
 .../planner/utils/MemoryTableSourceSinkUtil.scala  |   2 +-
 .../flink/table/planner

(flink-kubernetes-operator) branch dependabot/maven/com.puppycrawl.tools-checkstyle-8.29 deleted (was cffd45bf)

2023-11-22 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch dependabot/maven/com.puppycrawl.tools-checkstyle-8.29
in repository https://gitbox.apache.org/repos/asf/flink-kubernetes-operator.git


 was cffd45bf Bump checkstyle from 8.14 to 8.29

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(flink-connector-elasticsearch) branch dependabot/maven/flink-connector-elasticsearch-base/org.elasticsearch-elasticsearch-7.17.14 created (now 6e42d85)

2023-11-22 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch dependabot/maven/flink-connector-elasticsearch-base/org.elasticsearch-elasticsearch-7.17.14
in repository https://gitbox.apache.org/repos/asf/flink-connector-elasticsearch.git


  at 6e42d85  Bump org.elasticsearch:elasticsearch

No new revisions were added by this update.



(flink-connector-elasticsearch) branch dependabot/maven/flink-connector-elasticsearch-base/org.elasticsearch-elasticsearch-7.17.13 deleted (was 24cd74a)

2023-11-22 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch dependabot/maven/flink-connector-elasticsearch-base/org.elasticsearch-elasticsearch-7.17.13
in repository https://gitbox.apache.org/repos/asf/flink-connector-elasticsearch.git


 was 24cd74a  Bump org.elasticsearch:elasticsearch

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(flink) branch master updated: [FLINK-29452][test] Allow unit tests to be executed individually

2023-11-22 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 2378babf86c [FLINK-29452][test] Allow unit tests to be executed individually
2378babf86c is described below

commit 2378babf86cd298525ef58c41f019d5c4d900383
Author: Ryan Skraba 
AuthorDate: Thu Nov 10 14:54:56 2022 +0100

[FLINK-29452][test] Allow unit tests to be executed individually

[hotfix] Add missing space padding for test retry message

[FLINK-29452][test] Refactor for helper methods from review

Apply suggestions from code review

Co-authored-by: Matthias Pohl 

Change private HashMap<> to interface
---
 .../retry/RetryTestExecutionExtension.java |  2 +-
 .../junit/RetryOnExceptionExtensionTest.java   | 77 +++---
 .../junit/RetryOnFailureExtensionTest.java | 75 ++---
 3 files changed, 106 insertions(+), 48 deletions(-)

diff --git a/flink-test-utils-parent/flink-test-utils-junit/src/main/java/org/apache/flink/testutils/junit/extensions/retry/RetryTestExecutionExtension.java b/flink-test-utils-parent/flink-test-utils-junit/src/main/java/org/apache/flink/testutils/junit/extensions/retry/RetryTestExecutionExtension.java
index 8126c74015a..74e56db5944 100644
--- a/flink-test-utils-parent/flink-test-utils-junit/src/main/java/org/apache/flink/testutils/junit/extensions/retry/RetryTestExecutionExtension.java
+++ b/flink-test-utils-parent/flink-test-utils-junit/src/main/java/org/apache/flink/testutils/junit/extensions/retry/RetryTestExecutionExtension.java
@@ -51,7 +51,7 @@ public class RetryTestExecutionExtension
 RetryStrategy retryStrategy = getRetryStrategyInStore(context);
 String method = getTestMethodKey(context);
 if (!retryStrategy.hasNextAttempt()) {
-return ConditionEvaluationResult.disabled(method + "has already passed or failed.");
+return ConditionEvaluationResult.disabled(method + " has already passed or failed.");
 }
 return ConditionEvaluationResult.enabled(
 String.format("Test %s[%d/%d]", method, retryIndex, totalTimes));
diff --git a/flink-test-utils-parent/flink-test-utils-junit/src/test/java/org/apache/flink/testutils/junit/RetryOnExceptionExtensionTest.java b/flink-test-utils-parent/flink-test-utils-junit/src/test/java/org/apache/flink/testutils/junit/RetryOnExceptionExtensionTest.java
index d1186a489f5..3ff0d64bc40 100644
--- a/flink-test-utils-parent/flink-test-utils-junit/src/test/java/org/apache/flink/testutils/junit/RetryOnExceptionExtensionTest.java
+++ b/flink-test-utils-parent/flink-test-utils-junit/src/test/java/org/apache/flink/testutils/junit/RetryOnExceptionExtensionTest.java
@@ -22,12 +22,17 @@ import org.apache.flink.testutils.junit.extensions.retry.RetryExtension;
 import org.apache.flink.testutils.junit.extensions.retry.strategy.RetryOnExceptionStrategy;
 
 import org.junit.jupiter.api.AfterAll;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.TestInfo;
 import org.junit.jupiter.api.TestTemplate;
 import org.junit.jupiter.api.extension.ExtendWith;
 import org.junit.jupiter.params.ParameterizedTest;
 import org.junit.jupiter.params.provider.MethodSource;
 import org.opentest4j.TestAbortedException;
 
+import java.util.HashMap;
+import java.util.Map;
+import java.util.function.Consumer;
 import java.util.stream.Stream;
 
 import static org.assertj.core.api.Assertions.assertThat;
@@ -37,53 +37,77 @@ import static org.assertj.core.api.Assertions.assertThatThrownBy;
 @ExtendWith(RetryExtension.class)
 class RetryOnExceptionExtensionTest {
 
-private static final int NUMBER_OF_RUNS = 3;
+private static final int NUMBER_OF_RETRIES = 3;
 
-private static int runsForSuccessfulTest = 0;
+private static final Map<String, Integer> methodRunCount = new HashMap<>();
 
-private static int runsForTestWithMatchingException = 0;
+private static final Map verificationCallbackRegistry = new HashMap<>();
 
-private static int runsForTestWithSubclassException = 0;
+@BeforeEach
+void incrementMethodRunCount(TestInfo testInfo) {
+// Set or increment the run count for the unit test method, by the method short name.
+// This starts at 1 and is incremented before the test starts.
+testInfo.getTestMethod()
+.ifPresent(
+method ->
+methodRunCount.compute(
+method.getName(), (k, v) -> (v == null) ? 1 : v + 1));
+}
+
+private static int assertAndReturnRunCount(TestInfo testInfo) {
+return methodRunCount.get(assertAndReturnTestMethodName(testInfo));
+}
 
-private static int runsForPassAfterOneFailure = 0;
+private static void registerCallbackForTest(TestInfo 

(flink-kubernetes-operator) branch dependabot/maven/com.puppycrawl.tools-checkstyle-8.29 created (now 60847c0e)

2023-11-22 Thread mxm
This is an automated email from the ASF dual-hosted git repository.

mxm pushed a change to branch dependabot/maven/com.puppycrawl.tools-checkstyle-8.29
in repository https://gitbox.apache.org/repos/asf/flink-kubernetes-operator.git


  at 60847c0e Bump checkstyle from 8.14 to 8.29

No new revisions were added by this update.



(flink-kubernetes-operator) branch dependabot/maven/com.puppycrawl.tools-checkstyle-8.29 updated (60847c0e -> cffd45bf)

2023-11-22 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch dependabot/maven/com.puppycrawl.tools-checkstyle-8.29
in repository https://gitbox.apache.org/repos/asf/flink-kubernetes-operator.git


omit 60847c0e Bump checkstyle from 8.14 to 8.29
 add a7b8a767 [FLINK-27098] Use namespaced Kubernetes client when creating InformerEventSource in session job controller
 add dcc673fa [FLINK-26663] Pod augmentation for the operator
 add 4b11ecdd [FLINK-27005] Bump CRD version to v1beta1
 add 73894818 [FLINK-27065] Store last reconciled specs as string
 add bd835f8f [FLINK-26905] Re-add FlinkDeploymentList and FlinkSessionJobList classes
 add a8343521 [FLINK-27141] Improve FlinkService#waitForClusterShutdown logic
 add b8721116 [FLINK-27154] Disable web.cancel.enable for application clusters
 add c33f7e1a [FLINK-27211] Adds deployments/finalizers for OpenShift Deployment
 add ff1ca4e8 [FLINK-26871] Handle session job spec change
 add 56af85a2 [FLINK-26811][docs] Document CRD upgrade process
 add 06dd7867 [FLINK-27124] Flink Kubernetes operator prints starting logs with corrent version
 add 22042311 [FLINK-26140] Support rollback strategies
 add bd219691 [FLINK-27269] Clean up the jar file after submitting the job
 add 0c0ae05a [FLINK-27029] DeploymentValidator should take default flink config into account during validation
 add dc1cca82 [FLINK-27289] Avoid calling waitForClusterShutdown twice when stopping session cluster with deleting HA data
 add e0e34cb4 [FLINK-27310] Fix FlinkOperatorITCase
 add 6e1f0e8c [FLINK-27310] Improve github CI for integration tests
 add 805fef58 [FLINK-27023] Unify flink and operator configuration
 add f7ee710e [FLINK-27161] Support to fetch user jar from different sources for session job
 add f520adfc [FLINK-27279] Extract common status interfaces
 add 4559495f [FLINK-27360] Rename clusterId field of FlinkSessionJobSpec to deploymentName
 add d1b20f1a [FLINK-26926] Allow users to force upgrade even if savepoint is in progress
 add 707102be [FLINK-27358] Fix NPE and avoid unnecessary configmap update
 add 5a81a09b [FLINK-27334] Support auto generate the doc for the KubernetesOperatorConfigOptions
 add 85eacb3a [hotfix] fix typo
 add 31ae37cc [hotfix] Harden the ArtifactManagerTest
 add 87062b88 [FLINK-27160] Add e2e tests for session job
 add 247a12f2 [FLINK-27397] Improve the CrdReferenceDoclet to handle the abstract class
 add c377ebe2 [FLINK-27303][FLINK-27309] Introduce FlinkConfigManager for efficient config management
 add 1cb3eccc [FLINK-27129][docs] Hardcoded namespace in FlinkDeployment manifests may fail to deploy
 add 734a0adf [FLINK-27422] Do not create temporary pod template files for JobManager and TaskManager if not configured explicitly
 add 9b854ad1 [FLINK-27451] Enable the validator plugin in webhook
 add 399887eb [FLINK-27362] Support restartNonce semantics in session job
 add 86a0d396 [FLINK-27303] Improve config cache settings + add cleanup
 add 5328e6b4 [FLINK-27458] Expose allowNonRestoredState flag in JobSpec
 add a28cb336 [FLINK-26953] Introduce Operator Specific Metrics
 add 6ff7cf31 [FLINK-27262] Enrich validator for FlinkSessionJob
 add de75dcdc [FLINK-27097] Document custom validator implementations
 add b280822b [FLINK-27468] Recover missing deployments and other cancel/upgrade improvements for 1.15
 add 52447287 [FLINK-27500] Validation errors should not block reconciliation
 add ab1893c6 [hotfix] Specify container in kubectl command
 add 6ce3d969 [FLINK-27412] Allow flinkVersion v1_13 in flink-kubernetes-operator and improve e2e tests to cover all supported Flink versions
 add 60654f4f [FLINK-27261] Disable 'web.cancel.enable' for session cluster
 add c8a43104 [FLINK-27551] Update status manually instead of relying on updatecontrol
 add cf0ebe1e [FLINK-27329] Add default value of replica of JM pod and not declare it in example yamls
 add 65e66a86 [FLINK-27036] Exclude final release tags from docker build
 add 9995a5d6 [FLINK-26639] Publish snapshot artifacts nightly
 add fb51b6af [FLINK-27499] Bump base Flink version to 1.15.0
 add d88102ff [FLINK-26639] Snapshot publishing fixes
 add e3fc9cd5 [FLINK-26639][hotfix] Snapshot publishing typo
 add 73369b85 [FLINK-27573] Configuring a new random job result store directory
 add af086601 [FLINK-27595] Make security context configurable in helm
 add ae899f64 [FLINK-27495] Observe last savepoint status directly from cluster
 add 5239cd6a [FLINK-27337] Prevent session cluster to be deleted when there are running jobs
 add 85fc32a8 [FLINK-27270] Add document of session job operations
 add be54f2be [FLINK-27483] Make http artifact fetcher headers configurable
 add 6c9c4525 [FLINK-27614] Use informer in webhook to avoi