(flink) branch master updated: [FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0

2024-01-08 Thread xtsong
This is an automated email from the ASF dual-hosted git repository.

xtsong pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new e07545e458b [FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0
e07545e458b is described below

commit e07545e458bd22099244a353ac29477ca3a13811
Author: Laffery 
AuthorDate: Tue Dec 26 11:04:10 2023 +0800

[FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0

This closes #23999
---
 .../src/app/components/dagre/components/node/node.component.ts  | 2 +-
 .../web-dashboard/src/app/components/humanize-date.pipe.ts  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts b/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
index b862b63777a..368a7053603 100644
--- a/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
+++ b/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
@@ -80,7 +80,7 @@ export class NodeComponent {
   }
 
   isValid = (value?: number): boolean => {
-return !!value || value === 0 || value === NaN;
+return value === undefined || value === 0 || isNaN(value);
   };
 
   toRGBA = (d: string): number[] => {
diff --git a/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts b/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
index ee76a0acbb8..d303dbe700e 100644
--- a/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
+++ b/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
@@ -32,7 +32,7 @@ export class HumanizeDatePipe implements PipeTransform {
 timezone?: string,
 locale?: string
   ): string | null | undefined {
-if (value == null || value === '' || value !== value || value < 0) {
+if (value == null || value === '' || value !== value || +value < 0) {
   return '–';
 }
 
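For background, here is a minimal TypeScript sketch of the two patterns this commit fixes. It is illustrative only: `isValid` mirrors the committed arrow function, while `isNegative` is a hypothetical stand-in for the pipe's guard, not the actual `HumanizeDatePipe` code.

```typescript
// `NaN === NaN` evaluates to false at runtime, so a check written as
// `value === NaN` can never be true; recent TypeScript flags such a
// comparison as always-false. The fix uses isNaN() plus an explicit
// undefined check:
const isValid = (value?: number): boolean =>
  value === undefined || value === 0 || isNaN(value);

// For the date pipe, `value < 0` on a string | number | Date operand
// relies on an implicit numeric coercion; the unary plus makes the
// conversion explicit and keeps TypeScript 5.0 happy:
const isNegative = (value: string | number | Date): boolean => +value < 0;

// NaN is the only value not equal to itself (shown via a variable to
// avoid the flagged literal comparison):
const x = Number.NaN;
console.log(x !== x); // true
```

Note that `+new Date(...)` yields the epoch timestamp in milliseconds, so the pipe's negativity check behaves the same for `Date` inputs as for numeric ones.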



(flink) branch release-1.17 updated: [FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0

2024-01-08 Thread xtsong
This is an automated email from the ASF dual-hosted git repository.

xtsong pushed a commit to branch release-1.17
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.17 by this push:
 new ce62d454774 [FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0
ce62d454774 is described below

commit ce62d45477447537088930ec116f7e18a2743166
Author: Laffery 
AuthorDate: Tue Dec 26 11:04:10 2023 +0800

[FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0

This closes #23999
---
 .../src/app/components/dagre/components/node/node.component.ts  | 2 +-
 .../web-dashboard/src/app/components/humanize-date.pipe.ts  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts b/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
index b862b63777a..368a7053603 100644
--- a/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
+++ b/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
@@ -80,7 +80,7 @@ export class NodeComponent {
   }
 
   isValid = (value?: number): boolean => {
-return !!value || value === 0 || value === NaN;
+return value === undefined || value === 0 || isNaN(value);
   };
 
   toRGBA = (d: string): number[] => {
diff --git a/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts b/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
index ee76a0acbb8..d303dbe700e 100644
--- a/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
+++ b/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
@@ -32,7 +32,7 @@ export class HumanizeDatePipe implements PipeTransform {
 timezone?: string,
 locale?: string
   ): string | null | undefined {
-if (value == null || value === '' || value !== value || value < 0) {
+if (value == null || value === '' || value !== value || +value < 0) {
   return '–';
 }
 



(flink) branch release-1.18 updated: [FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0

2024-01-08 Thread xtsong
This is an automated email from the ASF dual-hosted git repository.

xtsong pushed a commit to branch release-1.18
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.18 by this push:
 new 12463fbad39 [FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0
12463fbad39 is described below

commit 12463fbad39edc17af687c1421bba4623f924083
Author: Laffery 
AuthorDate: Tue Dec 26 11:04:10 2023 +0800

[FLINK-33938][runtime-web] Correct implicit coercions in relational operators to adopt typescript 5.0

This closes #23999
---
 .../src/app/components/dagre/components/node/node.component.ts  | 2 +-
 .../web-dashboard/src/app/components/humanize-date.pipe.ts  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts b/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
index b862b63777a..368a7053603 100644
--- a/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
+++ b/flink-runtime-web/web-dashboard/src/app/components/dagre/components/node/node.component.ts
@@ -80,7 +80,7 @@ export class NodeComponent {
   }
 
   isValid = (value?: number): boolean => {
-return !!value || value === 0 || value === NaN;
+return value === undefined || value === 0 || isNaN(value);
   };
 
   toRGBA = (d: string): number[] => {
diff --git a/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts b/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
index ee76a0acbb8..d303dbe700e 100644
--- a/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
+++ b/flink-runtime-web/web-dashboard/src/app/components/humanize-date.pipe.ts
@@ -32,7 +32,7 @@ export class HumanizeDatePipe implements PipeTransform {
 timezone?: string,
 locale?: string
   ): string | null | undefined {
-if (value == null || value === '' || value !== value || value < 0) {
+if (value == null || value === '' || value !== value || +value < 0) {
   return '–';
 }
 



(flink-benchmarks) branch master updated: [hotfix] Extract common preparing directory code into one function for state benchmark

2024-01-08 Thread hangxiang
This is an automated email from the ASF dual-hosted git repository.

hangxiang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-benchmarks.git


The following commit(s) were added to refs/heads/master by this push:
 new 8b0b1c6  [hotfix] Extract common preparing directory code into one function for state benchmark
8b0b1c6 is described below

commit 8b0b1c620107282548c8b6014eb303a10c79d921
Author: Zakelly 
AuthorDate: Sun Jan 7 11:31:47 2024 +0800

[hotfix] Extract common preparing directory code into one function for state benchmark
---
 .../state/benchmark/RescalingBenchmarkBase.java  | 20 +++-
 .../flink/state/benchmark/StateBenchmarkBase.java|  6 +-
 2 files changed, 8 insertions(+), 18 deletions(-)

diff --git a/src/main/java/org/apache/flink/state/benchmark/RescalingBenchmarkBase.java b/src/main/java/org/apache/flink/state/benchmark/RescalingBenchmarkBase.java
index ef03389..be25fd7 100644
--- a/src/main/java/org/apache/flink/state/benchmark/RescalingBenchmarkBase.java
+++ b/src/main/java/org/apache/flink/state/benchmark/RescalingBenchmarkBase.java
@@ -46,6 +46,8 @@ import java.util.Arrays;
 import java.util.Iterator;
 import java.util.Random;
 
+import static org.apache.flink.state.benchmark.StateBenchmarkBase.createStateDataDir;
+
 public class RescalingBenchmarkBase extends BenchmarkBase {
 
 @Param({"RESCALE_IN", "RESCALE_OUT"})
@@ -64,23 +66,7 @@ public class RescalingBenchmarkBase extends BenchmarkBase {
 }
 
 protected static File prepareDirectory(String prefix) throws IOException {
-Configuration benchMarkConfig = ConfigUtil.loadBenchMarkConf();
-String stateDataDirPath = benchMarkConfig.getString(StateBenchmarkOptions.STATE_DATA_DIR);
-File dataDir = null;
-if (stateDataDirPath != null) {
-dataDir = new File(stateDataDirPath);
-if (!dataDir.exists()) {
-Files.createDirectories(Paths.get(stateDataDirPath));
-}
-}
-File target = File.createTempFile(prefix, "", dataDir);
-if (target.exists() && !target.delete()) {
-throw new IOException("Target dir {" + target.getAbsolutePath() + "} exists but failed to clean it up");
-} else if (!target.mkdirs()) {
-throw new IOException("Failed to create target directory: " + target.getAbsolutePath());
-} else {
-return target;
-}
+return StateBackendBenchmarkUtils.prepareDirectory(prefix, createStateDataDir());
 }
 
 @State(Scope.Thread)
diff --git a/src/main/java/org/apache/flink/state/benchmark/StateBenchmarkBase.java b/src/main/java/org/apache/flink/state/benchmark/StateBenchmarkBase.java
index 99e9c48..345063c 100644
--- a/src/main/java/org/apache/flink/state/benchmark/StateBenchmarkBase.java
+++ b/src/main/java/org/apache/flink/state/benchmark/StateBenchmarkBase.java
@@ -67,6 +67,10 @@ public class StateBenchmarkBase extends BenchmarkBase {
 }
 
 protected KeyedStateBackend createKeyedStateBackend(TtlTimeProvider ttlTimeProvider) throws Exception {
+return StateBackendBenchmarkUtils.createKeyedStateBackend(backendType, createStateDataDir());
+}
+
+public static File createStateDataDir() throws IOException {
 Configuration benchMarkConfig = ConfigUtil.loadBenchMarkConf();
String stateDataDirPath = benchMarkConfig.getString(StateBenchmarkOptions.STATE_DATA_DIR);
 File dataDir = null;
@@ -76,7 +80,7 @@ public class StateBenchmarkBase extends BenchmarkBase {
 Files.createDirectories(Paths.get(stateDataDirPath));
 }
 }
-return StateBackendBenchmarkUtils.createKeyedStateBackend(backendType, dataDir);
+return dataDir;
 }
 
 private static int getCurrentIndex() {



(flink) branch master updated: [FLINK-33414] Fix the unstable test MiniClusterITCase.testHandleStreamingJobsWhenNotEnoughSlot

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new f45de0c6de8 [FLINK-33414] Fix the unstable test MiniClusterITCase.testHandleStreamingJobsWhenNotEnoughSlot
f45de0c6de8 is described below

commit f45de0c6de87c182a49f2a979884d3d2f66f870f
Author: jiangxin 
AuthorDate: Mon Jan 8 20:32:05 2024 +0800

[FLINK-33414] Fix the unstable test MiniClusterITCase.testHandleStreamingJobsWhenNotEnoughSlot
---
 .../flink/runtime/minicluster/MiniClusterITCase.java   | 14 --
 1 file changed, 12 insertions(+), 2 deletions(-)

diff --git a/flink-runtime/src/test/java/org/apache/flink/runtime/minicluster/MiniClusterITCase.java b/flink-runtime/src/test/java/org/apache/flink/runtime/minicluster/MiniClusterITCase.java
index e18fe24281d..ba3ca9b5237 100644
--- a/flink-runtime/src/test/java/org/apache/flink/runtime/minicluster/MiniClusterITCase.java
+++ b/flink-runtime/src/test/java/org/apache/flink/runtime/minicluster/MiniClusterITCase.java
@@ -23,6 +23,7 @@ import org.apache.flink.api.common.JobSubmissionResult;
 import org.apache.flink.api.common.restartstrategy.RestartStrategies;
 import org.apache.flink.configuration.Configuration;
 import org.apache.flink.configuration.JobManagerOptions;
+import org.apache.flink.configuration.ResourceManagerOptions;
 import org.apache.flink.runtime.client.JobExecutionException;
 import org.apache.flink.runtime.io.network.partition.ResultPartitionType;
 import org.apache.flink.runtime.jobgraph.DistributionPattern;
@@ -123,10 +124,19 @@ class MiniClusterITCase {
 
 private void runHandleJobsWhenNotEnoughSlots(final JobGraph jobGraph) throws Exception {
 final Configuration configuration = new Configuration();
+
+// the slot timeout needs to be high enough to avoid causing TimeoutException
+Duration slotRequestTimeout = Duration.ofNanos(Long.MAX_VALUE);
+
 // this triggers the failure for the default scheduler
-configuration.setLong(JobManagerOptions.SLOT_REQUEST_TIMEOUT, 100L);
+configuration.setLong(JobManagerOptions.SLOT_REQUEST_TIMEOUT, slotRequestTimeout.toMillis());
 // this triggers the failure for the adaptive scheduler
-configuration.set(JobManagerOptions.RESOURCE_WAIT_TIMEOUT, Duration.ofMillis(100));
+configuration.set(JobManagerOptions.RESOURCE_WAIT_TIMEOUT, slotRequestTimeout);
+
+// cluster startup relies on SLOT_REQUEST_TIMEOUT as a fallback if the following parameter
+// is not set which causes the test to take longer
+configuration.set(ResourceManagerOptions.STANDALONE_CLUSTER_STARTUP_PERIOD_TIME, 1L);
 
 final MiniClusterConfiguration cfg =
 new MiniClusterConfiguration.Builder()



(flink-connector-aws) branch v4.2 updated: [FLINK-34017][docs] Connectors docs with 404 link in dynamodb.md (#123)

2024-01-08 Thread dannycranmer
This is an automated email from the ASF dual-hosted git repository.

dannycranmer pushed a commit to branch v4.2
in repository https://gitbox.apache.org/repos/asf/flink-connector-aws.git


The following commit(s) were added to refs/heads/v4.2 by this push:
 new 7a7a6a0  [FLINK-34017][docs] Connectors docs with 404 link in dynamodb.md (#123)
7a7a6a0 is described below

commit 7a7a6a0f0b8274b3196a82cfb2cd3ef0776d967f
Author: gongzhongqiang 
AuthorDate: Mon Jan 8 21:52:03 2024 +0800

[FLINK-34017][docs] Connectors docs with 404 link in dynamodb.md (#123)
---
 docs/content.zh/docs/connectors/datastream/dynamodb.md | 2 +-
 docs/content/docs/connectors/datastream/dynamodb.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content.zh/docs/connectors/datastream/dynamodb.md b/docs/content.zh/docs/connectors/datastream/dynamodb.md
index 477b543..def608c 100644
--- a/docs/content.zh/docs/connectors/datastream/dynamodb.md
+++ b/docs/content.zh/docs/connectors/datastream/dynamodb.md
@@ -136,7 +136,7 @@ An element converter is used to convert from a record in the DataStream to a Dyn
`DynamoDbBeanElementConverter` when you are working with `@DynamoDbBean` objects. For more information on supported 
annotations see [here](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/examples-dynamodb-enhanced.html#dynamodb-enhanced-mapper-tableschema).

-A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkDynamoDbBeanIntoD [...]
+A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dyna [...]
 
 ## Using Custom DynamoDB Endpoints
 
diff --git a/docs/content/docs/connectors/datastream/dynamodb.md b/docs/content/docs/connectors/datastream/dynamodb.md
index caa65c7..d0f6d7c 100644
--- a/docs/content/docs/connectors/datastream/dynamodb.md
+++ b/docs/content/docs/connectors/datastream/dynamodb.md
@@ -136,7 +136,7 @@ An element converter is used to convert from a record in the DataStream to a Dyn
`DynamoDbBeanElementConverter` when you are working with `@DynamoDbBean` objects. For more information on supported 
annotations see [here](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/examples-dynamodb-enhanced.html#dynamodb-enhanced-mapper-tableschema).

-A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkDynamoDbBeanIntoD [...]
+A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dyna [...]
 
 ## Using Custom DynamoDB Endpoints
 



(flink-connector-aws) branch main updated: [FLINK-34017][docs] Connectors docs with 404 link in dynamodb.md (#124)

2024-01-08 Thread dannycranmer
This is an automated email from the ASF dual-hosted git repository.

dannycranmer pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-aws.git


The following commit(s) were added to refs/heads/main by this push:
 new 38aafb4  [FLINK-34017][docs] Connectors docs with 404 link in dynamodb.md (#124)
38aafb4 is described below

commit 38aafb44d3a8200e4ff41d87e0780338f40da258
Author: gongzhongqiang 
AuthorDate: Mon Jan 8 21:51:28 2024 +0800

[FLINK-34017][docs] Connectors docs with 404 link in dynamodb.md (#124)
---
 docs/content.zh/docs/connectors/datastream/dynamodb.md | 2 +-
 docs/content/docs/connectors/datastream/dynamodb.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content.zh/docs/connectors/datastream/dynamodb.md b/docs/content.zh/docs/connectors/datastream/dynamodb.md
index 477b543..def608c 100644
--- a/docs/content.zh/docs/connectors/datastream/dynamodb.md
+++ b/docs/content.zh/docs/connectors/datastream/dynamodb.md
@@ -136,7 +136,7 @@ An element converter is used to convert from a record in the DataStream to a Dyn
`DynamoDbBeanElementConverter` when you are working with `@DynamoDbBean` objects. For more information on supported 
annotations see [here](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/examples-dynamodb-enhanced.html#dynamodb-enhanced-mapper-tableschema).

-A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkDynamoDbBeanIntoD [...]
+A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dyna [...]
 
 ## Using Custom DynamoDB Endpoints
 
diff --git a/docs/content/docs/connectors/datastream/dynamodb.md b/docs/content/docs/connectors/datastream/dynamodb.md
index caa65c7..d0f6d7c 100644
--- a/docs/content/docs/connectors/datastream/dynamodb.md
+++ b/docs/content/docs/connectors/datastream/dynamodb.md
@@ -136,7 +136,7 @@ An element converter is used to convert from a record in the DataStream to a Dyn
`DynamoDbBeanElementConverter` when you are working with `@DynamoDbBean` objects. For more information on supported 
annotations see [here](https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/examples-dynamodb-enhanced.html#dynamodb-enhanced-mapper-tableschema).

-A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkDynamoDbBeanIntoD [...]
+A sample application using a custom `ElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/examples/SinkIntoDynamoDb.java). A sample application using the `DynamoDbBeanElementConverter` can be found [here](https://github.com/apache/flink-connector-aws/blob/main/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dyna [...]
 
 ## Using Custom DynamoDB Endpoints
 



(flink) branch master updated: [FLINK-34001][docs][table] Fix ambiguous document description towards the default value for configuring operator-level state TTL

2024-01-08 Thread jchan
This is an automated email from the ASF dual-hosted git repository.

jchan pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 3171553d407 [FLINK-34001][docs][table] Fix ambiguous document description towards the default value for configuring operator-level state TTL
3171553d407 is described below

commit 3171553d407d9560b30464ab4950d0181f08c8e1
Author: Jane Chan 
AuthorDate: Mon Jan 8 20:23:51 2024 +0800

[FLINK-34001][docs][table] Fix ambiguous document description towards the default value for configuring operator-level state TTL

This closes #24039
---
 docs/content/docs/dev/table/concepts/overview.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/content/docs/dev/table/concepts/overview.md b/docs/content/docs/dev/table/concepts/overview.md
index 08518cf2cb6..4deee64d56b 100644
--- a/docs/content/docs/dev/table/concepts/overview.md
+++ b/docs/content/docs/dev/table/concepts/overview.md
@@ -508,7 +508,7 @@ It performs a regular inner join with different state TTL for left and right sid
   ]
 ```
The `"index"` indicates the current state is the i-th input the operator, and the index starts from zero.
-The current TTL value for both left and right side is `"0 ms"`, which means the state retention is not enabled.
+The current TTL value for both left and right side is `"0 ms"`, which means the state never expires.
Now change the value of left state to `"3000 ms"` and right state to `"9000 ms"`.
 ```json
 "state": [



(flink) 02/02: [FLINK-34000] Remove IncrementalGroupAgg Json Plan & IT tests

2024-01-08 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 0df5ab5a3318d21e8be3ab9237900664e3741013
Author: bvarghese1 
AuthorDate: Thu Jan 4 20:07:47 2024 -0800

[FLINK-34000] Remove IncrementalGroupAgg Json Plan & IT tests

- These are covered by the restore tests
---
 .../stream/IncrementalAggregateJsonPlanTest.java   | 106 
 .../IncrementalAggregateJsonPlanITCase.java|  78 ---
 .../testIncrementalAggregate.out   | 401 --
 ...lAggregateWithSumCountDistinctAndRetraction.out | 585 -
 4 files changed, 1170 deletions(-)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalAggregateJsonPlanTest.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalAggregateJsonPlanTest.java
deleted file mode 100644
index 26dcc04f303..000
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalAggregateJsonPlanTest.java
+++ /dev/null
@@ -1,106 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.flink.table.planner.plan.nodes.exec.stream;
-
-import org.apache.flink.table.api.TableConfig;
-import org.apache.flink.table.api.TableEnvironment;
-import org.apache.flink.table.api.config.ExecutionConfigOptions;
-import org.apache.flink.table.api.config.OptimizerConfigOptions;
-import org.apache.flink.table.planner.plan.rules.physical.stream.IncrementalAggregateRule;
-import org.apache.flink.table.planner.utils.AggregatePhaseStrategy;
-import org.apache.flink.table.planner.utils.StreamTableTestUtil;
-import org.apache.flink.table.planner.utils.TableTestBase;
-
-import org.junit.jupiter.api.BeforeEach;
-import org.junit.jupiter.api.Test;
-
-import java.time.Duration;
-
-/** Test json serialization/deserialization for incremental aggregate. */
-class IncrementalAggregateJsonPlanTest extends TableTestBase {
-
-private StreamTableTestUtil util;
-private TableEnvironment tEnv;
-
-@BeforeEach
-void setup() {
-util = streamTestUtil(TableConfig.getDefault());
-tEnv = util.getTableEnv();
-tEnv.getConfig()
-.set(
-OptimizerConfigOptions.TABLE_OPTIMIZER_AGG_PHASE_STRATEGY,
-AggregatePhaseStrategy.TWO_PHASE.name())
-.set(OptimizerConfigOptions.TABLE_OPTIMIZER_DISTINCT_AGG_SPLIT_ENABLED, true)
-.set(ExecutionConfigOptions.TABLE_EXEC_MINIBATCH_ENABLED, true)
-.set(
-ExecutionConfigOptions.TABLE_EXEC_MINIBATCH_ALLOW_LATENCY,
-Duration.ofSeconds(10))
-.set(ExecutionConfigOptions.TABLE_EXEC_MINIBATCH_SIZE, 5L)
-.set(IncrementalAggregateRule.TABLE_OPTIMIZER_INCREMENTAL_AGG_ENABLED(), true);
-
-String srcTableDdl =
-"CREATE TABLE MyTable (\n"
-+ "  a bigint,\n"
-+ "  b int not null,\n"
-+ "  c varchar,\n"
-+ "  d bigint\n"
-+ ") with (\n"
-+ "  'connector' = 'values',\n"
-+ "  'bounded' = 'false')";
-tEnv.executeSql(srcTableDdl);
-}
-
-@Test
-void testIncrementalAggregate() {
-String sinkTableDdl =
-"CREATE TABLE MySink (\n"
-+ "  a bigint,\n"
-+ "  c bigint\n"
-+ ") with (\n"
-+ "  'connector' = 'values',\n"
-+ "  'sink-insert-only' = 'false',\n"
-+ "  'table-sink-class' = 'DEFAULT')";
-tEnv.executeSql(sinkTableDdl);
-util.verifyJsonPlan(
-"insert into MySink select a, "
-+ "count(distinct c) as c "
-+ "from MyTable group by a");
-}
-
-@Test
-void 

(flink) 01/02: [FLINK-34000] Implement restore tests for IncrementalGroupAgg node

2024-01-08 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit df71d07188e745553b8174297ec7989f05cebf7a
Author: bvarghese1 
AuthorDate: Thu Jan 4 20:05:38 2024 -0800

[FLINK-34000] Implement restore tests for IncrementalGroupAgg node
---
 .../IncrementalGroupAggregateRestoreTest.java  |  40 ++
 .../IncrementalGroupAggregateTestPrograms.java | 119 +
 .../plan/incremental-group-aggregate-complex.json  | 573 +
 .../savepoint/_metadata| Bin 0 -> 20817 bytes
 .../plan/incremental-group-aggregate-simple.json   | 373 ++
 .../savepoint/_metadata| Bin 0 -> 14768 bytes
 6 files changed, 1105 insertions(+)

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateRestoreTest.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateRestoreTest.java
new file mode 100644
index 000..250f50a38c7
--- /dev/null
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateRestoreTest.java
@@ -0,0 +1,40 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.stream;
+
+import org.apache.flink.table.planner.plan.nodes.exec.testutils.RestoreTestBase;
+import org.apache.flink.table.test.program.TableTestProgram;
+
+import java.util.Arrays;
+import java.util.List;
+
+/** Restore tests for {@link StreamExecIncrementalGroupAggregate}. */
+public class IncrementalGroupAggregateRestoreTest extends RestoreTestBase {
+
+public IncrementalGroupAggregateRestoreTest() {
+super(StreamExecIncrementalGroupAggregate.class);
+}
+
+@Override
+public List<TableTestProgram> programs() {
+return Arrays.asList(
+IncrementalGroupAggregateTestPrograms.INCREMENTAL_GROUP_AGGREGATE_SIMPLE,
+IncrementalGroupAggregateTestPrograms.INCREMENTAL_GROUP_AGGREGATE_COMPLEX);
+}
+}
diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateTestPrograms.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateTestPrograms.java
new file mode 100644
index 000..a1ca086d258
--- /dev/null
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateTestPrograms.java
@@ -0,0 +1,119 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.stream;
+
+import org.apache.flink.table.api.config.ExecutionConfigOptions;
+import org.apache.flink.table.api.config.OptimizerConfigOptions;
+import org.apache.flink.table.test.program.SinkTestStep;
+import org.apache.flink.table.test.program.SourceTestStep;
+import org.apache.flink.table.test.program.TableTestProgram;
+import org.apache.flink.types.Row;
+
+import java.time.Duration;
+
+/** {@link TableTestProgram} definitions for testing {@link StreamExecGroupAggregate}. */
+public class IncrementalGroupAggregateTestPrograms {
+
+static final Row[] BEFORE_DATA = {
+   

(flink) branch master updated (ed79a1fc312 -> 0df5ab5a331)

2024-01-08 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from ed79a1fc312 [hotfix][api] Adds @PublicEvolving to StateTtlConfig inner classes to remove ArchUnit exclusions
 new df71d07188e [FLINK-34000] Implement restore tests for IncrementalGroupAgg node
 new 0df5ab5a331 [FLINK-34000] Remove IncrementalGroupAgg Json Plan & IT tests

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../stream/IncrementalAggregateJsonPlanTest.java   | 106 -
 ...a => IncrementalGroupAggregateRestoreTest.java} |  12 +-
 .../IncrementalGroupAggregateTestPrograms.java | 119 ++
 .../IncrementalAggregateJsonPlanITCase.java|  78 -
 .../plan/incremental-group-aggregate-complex.json} | 176 ++---
 .../savepoint/_metadata| Bin 0 -> 20817 bytes
 .../plan/incremental-group-aggregate-simple.json}  | 116 ++
 .../savepoint/_metadata| Bin 0 -> 14768 bytes
 8 files changed, 251 insertions(+), 356 deletions(-)
 delete mode 100644 flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalAggregateJsonPlanTest.java
 copy flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/{TemporalJoinRestoreTest.java => IncrementalGroupAggregateRestoreTest.java} (72%)
 create mode 100644 flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalGroupAggregateTestPrograms.java
 delete mode 100644 flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/stream/jsonplan/IncrementalAggregateJsonPlanITCase.java
 rename flink-table/flink-table-planner/src/test/resources/{org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalAggregateJsonPlanTest_jsonplan/testIncrementalAggregateWithSumCountDistinctAndRetraction.out => restore-tests/stream-exec-incremental-group-aggregate_1/incremental-group-aggregate-complex/plan/incremental-group-aggregate-complex.json} (81%)
 create mode 100644 flink-table/flink-table-planner/src/test/resources/restore-tests/stream-exec-incremental-group-aggregate_1/incremental-group-aggregate-complex/savepoint/_metadata
 rename flink-table/flink-table-planner/src/test/resources/{org/apache/flink/table/planner/plan/nodes/exec/stream/IncrementalAggregateJsonPlanTest_jsonplan/testIncrementalAggregate.out => restore-tests/stream-exec-incremental-group-aggregate_1/incremental-group-aggregate-simple/plan/incremental-group-aggregate-simple.json} (71%)
 create mode 100644 flink-table/flink-table-planner/src/test/resources/restore-tests/stream-exec-incremental-group-aggregate_1/incremental-group-aggregate-simple/savepoint/_metadata



(flink-connector-elasticsearch) branch main updated: [FLINK-34002] Bump CI flink version on flink-connector-elasticsearch to support Flink 1.19. This closes #86

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-elasticsearch.git


The following commit(s) were added to refs/heads/main by this push:
 new 52f2afd  [FLINK-34002] Bump CI flink version on flink-connector-elasticsearch to support Flink 1.19. This closes #86
52f2afd is described below

commit 52f2afdda90d4d8881d64e6d13c8c8f8b94eb922
Author: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
AuthorDate: Mon Jan 8 13:07:24 2024 +0100

[FLINK-34002] Bump CI flink version on flink-connector-elasticsearch to support Flink 1.19. This closes #86

* [FLINK-34002] Bump CI Flink version on flink-connector-elasticsearch to support Flink 1.19

* [FLINK-34002] Java 17 support for connector

* Make tests passing for jdk 17+

-

Co-authored-by: Sergey Nuyanzin 
---
 .github/workflows/push_pr.yml| 9 -
 .github/workflows/weekly.yml | 7 +++
 .../flink-connector-elasticsearch6-e2e-tests/pom.xml | 2 ++
 .../flink-connector-elasticsearch7-e2e-tests/pom.xml | 2 ++
 flink-connector-elasticsearch-e2e-tests/pom.xml  | 6 ++
 flink-connector-elasticsearch6/pom.xml   | 2 ++
 flink-connector-elasticsearch7/pom.xml   | 2 ++
 pom.xml  | 3 +++
 8 files changed, 32 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/push_pr.yml b/.github/workflows/push_pr.yml
index 0527357..9d349c0 100644
--- a/.github/workflows/push_pr.yml
+++ b/.github/workflows/push_pr.yml
@@ -25,7 +25,14 @@ jobs:
   compile_and_test:
 strategy:
   matrix:
-flink: [1.17.1, 1.18.0, 1.19-SNAPSHOT]
+flink: [ 1.16-SNAPSHOT, 1.17-SNAPSHOT ]
+jdk: [ '8, 11' ]
+include:
+  - flink: 1.18-SNAPSHOT
+jdk: '8, 11, 17'
+  - flink: 1.19-SNAPSHOT
+jdk: '8, 11, 17, 21'
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink }}
+  jdk_version: ${{ matrix.jdk }}
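Stepping outside the diff: the hunk above relies on GitHub Actions matrix semantics, where an `include` entry whose values do not match any existing combination is appended as a new job, which is how the 1.18/1.19 rows get their wider JDK lists. A simplified, hedged Java sketch of that expansion (an illustrative approximation, not the full rule set from the GitHub Actions docs):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MatrixSketch {
    // Simplified GitHub Actions matrix expansion: build the cross-product of
    // the base axes, then apply each `include` entry, which either extends
    // matching combinations or is appended as a brand-new combination.
    static List<Map<String, String>> expand(
            Map<String, List<String>> base, List<Map<String, String>> includes) {
        List<Map<String, String>> combos = new ArrayList<>();
        combos.add(new LinkedHashMap<>());
        for (Map.Entry<String, List<String>> axis : base.entrySet()) {
            List<Map<String, String>> next = new ArrayList<>();
            for (Map<String, String> c : combos) {
                for (String v : axis.getValue()) {
                    Map<String, String> copy = new LinkedHashMap<>(c);
                    copy.put(axis.getKey(), v);
                    next.add(copy);
                }
            }
            combos = next;
        }
        for (Map<String, String> inc : includes) {
            boolean matched = false;
            for (Map<String, String> c : combos) {
                // An include matches a combination when every base-axis key
                // it sets agrees with that combination's value.
                boolean agrees = true;
                for (Map.Entry<String, String> e : inc.entrySet()) {
                    if (base.containsKey(e.getKey())
                            && !e.getValue().equals(c.get(e.getKey()))) {
                        agrees = false;
                        break;
                    }
                }
                if (agrees) {
                    c.putAll(inc);
                    matched = true;
                }
            }
            if (!matched) {
                combos.add(new LinkedHashMap<>(inc));
            }
        }
        return combos;
    }
}
```

Under this approximation, a base matrix of two Flink versions with one JDK string plus an unmatched 1.18-SNAPSHOT include yields three jobs, mirroring how the workflow above grows its job list.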
diff --git a/.github/workflows/weekly.yml b/.github/workflows/weekly.yml
index f5b46f0..19904b2 100644
--- a/.github/workflows/weekly.yml
+++ b/.github/workflows/weekly.yml
@@ -34,9 +34,11 @@ jobs:
   branch: main
 }, {
   flink: 1.18-SNAPSHOT,
+  jdk: '8, 11, 17',
   branch: main
 }, {
   flink: 1.19-SNAPSHOT,
+  jdk: '8, 11, 17, 21',
   branch: main
 }, {
   flink: 1.16.2,
@@ -44,9 +46,14 @@ jobs:
 }, {
   flink: 1.17.1,
   branch: v3.0
+}, {
+  flink: 1.18.0,
+  jdk: '8, 11, 17',
+  branch: v3.0
 }]
 uses: apache/flink-connector-shared-utils/.github/workflows/ci.yml@ci_utils
 with:
   flink_version: ${{ matrix.flink_branches.flink }}
   connector_branch: ${{ matrix.flink_branches.branch }}
+  jdk_version: ${{ matrix.flink_branches.jdk || '8, 11' }}
   run_dependency_convergence: false
diff --git a/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch6-e2e-tests/pom.xml b/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch6-e2e-tests/pom.xml
index a4bf490..d778c61 100644
--- a/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch6-e2e-tests/pom.xml
+++ b/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch6-e2e-tests/pom.xml
@@ -35,6 +35,8 @@ under the License.
 

6.8.20
+ --add-opens=java.base/java.util=ALL-UNNAMED

 

diff --git a/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch7-e2e-tests/pom.xml b/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch7-e2e-tests/pom.xml
index 78ec83a..617be40 100644
--- a/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch7-e2e-tests/pom.xml
+++ b/flink-connector-elasticsearch-e2e-tests/flink-connector-elasticsearch7-e2e-tests/pom.xml
@@ -35,6 +35,8 @@ under the License.
 

7.10.2
+ --add-opens=java.base/java.util=ALL-UNNAMED

 

diff --git a/flink-connector-elasticsearch-e2e-tests/pom.xml 
b/flink-connector-elasticsearch-e2e-tests/pom.xml
index 1114edc..c7f8a4b 100644
--- a/flink-connector-elasticsearch-e2e-tests/pom.xml
+++ b/flink-connector-elasticsearch-e2e-tests/pom.xml
@@ -39,6 +39,11 @@ under the License.
flink-connector-elasticsearch7-e2e-tests
 
 
+   
+ --add-opens=java.base/java.util=ALL-UNNAMED
+   
+


run-end-to-end-tests
@@ -63,6 +68,7 @@ under the License.
  

(flink) 02/05: [FLINK-32570][core] Introduces Duration replacements for org.apache.flink.api.common.time.Time-related APIs and deprecates the corresponding methods/classes/constructors

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit c943ab49c4aa58286111e2b95e9580a16f4d6b4c
Author: Matthias Pohl 
AuthorDate: Thu Jan 4 16:20:26 2024 +0100

[FLINK-32570][core] Introduces Duration replacements for org.apache.flink.api.common.time.Time-related APIs and deprecates the corresponding methods/classes/constructors
---
 .../common/restartstrategy/RestartStrategies.java  | 146 +++--
 .../flink/api/common/state/StateTtlConfig.java |  37 +-
 .../org/apache/flink/api/common/time/Time.java |  14 ++
 .../main/java/org/apache/flink/util/TimeUtils.java |   2 +
 .../RestartBackoffTimeStrategyFactoryLoader.java   |  12 +-
 ...estartBackoffTimeStrategyFactoryLoaderTest.java |  10 +-
 .../environment/StreamExecutionEnvironment.java|   5 +-
 .../flink/streaming/api/RestartStrategyTest.java   |   4 +-
 8 files changed, 200 insertions(+), 30 deletions(-)

diff --git a/flink-core/src/main/java/org/apache/flink/api/common/restartstrategy/RestartStrategies.java b/flink-core/src/main/java/org/apache/flink/api/common/restartstrategy/RestartStrategies.java
index 10363d77f65..9745e9ad1c1 100644
--- a/flink-core/src/main/java/org/apache/flink/api/common/restartstrategy/RestartStrategies.java
+++ b/flink-core/src/main/java/org/apache/flink/api/common/restartstrategy/RestartStrategies.java
@@ -90,9 +90,23 @@ public class RestartStrategies {
  * @param restartAttempts Number of restart attempts for the FixedDelayRestartStrategy
  * @param delayInterval Delay in-between restart attempts for the FixedDelayRestartStrategy
  * @return FixedDelayRestartStrategy
+ * @deprecated Use {@link #fixedDelayRestart(int, Duration)}
  */
+@Deprecated
 public static RestartStrategyConfiguration fixedDelayRestart(
 int restartAttempts, Time delayInterval) {
+return fixedDelayRestart(restartAttempts, Time.toDuration(delayInterval));
+}
+
+/**
+ * Generates a FixedDelayRestartStrategyConfiguration.
+ *
+ * @param restartAttempts Number of restart attempts for the FixedDelayRestartStrategy
+ * @param delayInterval Delay in-between restart attempts for the FixedDelayRestartStrategy
+ * @return FixedDelayRestartStrategy
+ */
+public static RestartStrategyConfiguration fixedDelayRestart(
+int restartAttempts, Duration delayInterval) {
 return new FixedDelayRestartStrategyConfiguration(restartAttempts, delayInterval);
 }
 
@@ -103,9 +117,25 @@ public class RestartStrategies {
  * before failing a job
  * @param failureInterval Time interval for failures
  * @param delayInterval Delay in-between restart attempts
+ * @deprecated Use {@link #failureRateRestart(int, Duration, Duration)}
  */
+@Deprecated
 public static FailureRateRestartStrategyConfiguration failureRateRestart(
 int failureRate, Time failureInterval, Time delayInterval) {
+return failureRateRestart(
+failureRate, Time.toDuration(failureInterval), Time.toDuration(delayInterval));
+}
+
+/**
+ * Generates a FailureRateRestartStrategyConfiguration.
+ *
+ * @param failureRate Maximum number of restarts in given interval {@code failureInterval}
+ * before failing a job
+ * @param failureInterval Time interval for failures
+ * @param delayInterval Delay in-between restart attempts
+ */
+public static FailureRateRestartStrategyConfiguration failureRateRestart(
+int failureRate, Duration failureInterval, Duration delayInterval) {
 return new FailureRateRestartStrategyConfiguration(
 failureRate, failureInterval, delayInterval);
 }
@@ -118,13 +148,39 @@ public class RestartStrategies {
  * @param backoffMultiplier Delay multiplier how many times is the delay longer than before
  * @param resetBackoffThreshold How long the job must run smoothly to reset the time interval
  * @param jitterFactor How much the delay may differ (in percentage)
+ * @deprecated Use {@link #exponentialDelayRestart(Duration, Duration, double, Duration, double)}
  */
+@Deprecated
+public static ExponentialDelayRestartStrategyConfiguration exponentialDelayRestart(
 Time initialBackoff,
 Time maxBackoff,
 double backoffMultiplier,
 Time resetBackoffThreshold,
 double jitterFactor) {
+return exponentialDelayRestart(
+Time.toDuration(initialBackoff),
+Time.toDuration(maxBackoff),
+backoffMultiplier,
+Time.toDuration(resetBackoffThreshold),
+jitterFactor);
+}
+
+/**
+ * Generates a ExponentialDelayRestartStrategyConfiguration.
+ *
+ * @param initialBackoff Starting duration between 
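The whole commit applies one mechanical migration pattern: keep the `Time`-based overload, mark it `@Deprecated`, and have it delegate to a new `java.time.Duration`-based overload so behavior is unchanged. A self-contained sketch of that deprecate-and-delegate pattern (`MiniTime` is an illustrative stand-in, not Flink's real `Time` class):

```java
import java.time.Duration;

public class DeprecationSketch {
    // Illustrative stand-in for org.apache.flink.api.common.time.Time.
    static final class MiniTime {
        final long millis;

        MiniTime(long millis) {
            this.millis = millis;
        }

        static Duration toDuration(MiniTime t) {
            return Duration.ofMillis(t.millis);
        }
    }

    /** @deprecated Use {@link #fixedDelayRestart(int, Duration)} */
    @Deprecated
    public static String fixedDelayRestart(int attempts, MiniTime delay) {
        // The old overload only converts and forwards, so callers see
        // identical behavior during the deprecation window.
        return fixedDelayRestart(attempts, MiniTime.toDuration(delay));
    }

    public static String fixedDelayRestart(int attempts, Duration delay) {
        return attempts + " attempts, " + delay.toMillis() + " ms delay";
    }
}
```

Because the deprecated overload is a pure forwarder, both entry points necessarily agree, which is what makes this kind of API migration safe to land incrementally.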

(flink) 05/05: [hotfix][api] Adds @PublicEvolving to StateTtlConfig inner classes to remove ArchUnit exclusions

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit ed79a1fc31237b96768bc45dc44b5e4a72429bf4
Author: Matthias Pohl 
AuthorDate: Thu Jul 13 14:30:32 2023 +0200

[hotfix][api] Adds @PublicEvolving to StateTtlConfig inner classes to remove ArchUnit exclusions

Signed-off-by: Matthias Pohl 
---
 .../archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8   | 5 -
 .../archunit-violations/7602816f-5c01-4b7a-9e3e-235dfedec245   | 7 ---
 .../java/org/apache/flink/api/common/state/StateTtlConfig.java | 7 +++
 3 files changed, 7 insertions(+), 12 deletions(-)

diff --git a/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8 b/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
index 6079bb488a7..8fa69ca0639 100644
--- a/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
+++ b/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
@@ -9,11 +9,6 @@ 
org.apache.flink.api.common.state.ListStateDescriptor.getType(): Returned leaf t
 org.apache.flink.api.common.state.MapStateDescriptor.getType(): Returned leaf 
type org.apache.flink.api.common.state.StateDescriptor$Type does not satisfy: 
reside outside of package 'org.apache.flink..' or reside in any package 
['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or 
annotated with @Deprecated
 org.apache.flink.api.common.state.ReducingStateDescriptor.getType(): Returned 
leaf type org.apache.flink.api.common.state.StateDescriptor$Type does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
 org.apache.flink.api.common.state.StateDescriptor.getType(): Returned leaf 
type org.apache.flink.api.common.state.StateDescriptor$Type does not satisfy: 
reside outside of package 'org.apache.flink..' or reside in any package 
['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or 
annotated with @Deprecated
-org.apache.flink.api.common.state.StateTtlConfig.getCleanupStrategies(): 
Returned leaf type 
org.apache.flink.api.common.state.StateTtlConfig$CleanupStrategies does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.state.StateTtlConfig.getStateVisibility(): 
Returned leaf type 
org.apache.flink.api.common.state.StateTtlConfig$StateVisibility does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.state.StateTtlConfig.getTtlTimeCharacteristic(): 
Returned leaf type 
org.apache.flink.api.common.state.StateTtlConfig$TtlTimeCharacteristic does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.state.StateTtlConfig.getUpdateType(): Returned 
leaf type org.apache.flink.api.common.state.StateTtlConfig$UpdateType does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.state.StateTtlConfig.newBuilder(org.apache.flink.api.common.time.Time):
 Returned leaf type org.apache.flink.api.common.state.StateTtlConfig$Builder 
does not satisfy: reside outside of package 'org.apache.flink..' or reside in 
any package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
 org.apache.flink.api.common.state.ValueStateDescriptor.getType(): Returned 
leaf type org.apache.flink.api.common.state.StateDescriptor$Type does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
 
org.apache.flink.api.common.typeinfo.PrimitiveArrayTypeInfo.createComparator(boolean,
 org.apache.flink.api.common.ExecutionConfig): Returned leaf type 
org.apache.flink.api.common.typeutils.base.array.PrimitiveArrayComparator does 
not satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
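For context: these violation entries come from an ArchUnit rule requiring that every type exposed through a public API be annotated `@Public`, `@PublicEvolving`, or `@Deprecated`; once the inner classes carry the annotation, the exclusions can be deleted. A rough, reflection-based approximation of such a check (simplified stand-ins, not the real ArchUnit rule):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class VisibilityCheckSketch {
    // Simplified stand-in for Flink's @PublicEvolving marker annotation.
    @Retention(RetentionPolicy.RUNTIME)
    @interface PublicEvolving {}

    @PublicEvolving
    static class AnnotatedResult {}

    static class UnannotatedResult {}

    @PublicEvolving
    static class Api {
        public AnnotatedResult good() {
            return new AnnotatedResult();
        }

        public UnannotatedResult bad() {
            return new UnannotatedResult();
        }
    }

    // Collect public methods whose returned in-project type lacks the marker,
    // mimicking the "returned leaf type must be annotated" rule above.
    static List<String> violations(Class<?> api) {
        List<String> out = new ArrayList<>();
        for (Method m : api.getDeclaredMethods()) {
            Class<?> ret = m.getReturnType();
            if (ret.getName().startsWith(VisibilityCheckSketch.class.getName())
                    && !ret.isAnnotationPresent(PublicEvolving.class)) {
                out.add(m.getName() + " -> " + ret.getSimpleName());
            }
        }
        return out;
    }
}
```

Annotating `UnannotatedResult` would empty the violation list, which is exactly the effect the hotfix series achieves by annotating the inner classes instead of maintaining exclusion files.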
 

(flink) 03/05: [hotfix][api] Adds missing @PublicEvolving to WindowAssigner-related classes to remove ArchUnit exclusions

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit ce5e03750a710417e0ee5fb734d5ac7399a13668
Author: Matthias Pohl 
AuthorDate: Thu Jul 13 14:53:24 2023 +0200

[hotfix][api] Adds missing @PublicEvolving to WindowAssigner-related classes to remove ArchUnit exclusions

Signed-off-by: Matthias Pohl 
---
 .../archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8  | 11 ---
 .../archunit-violations/7602816f-5c01-4b7a-9e3e-235dfedec245  |  6 --
 .../api/windowing/assigners/EventTimeSessionWindows.java  |  1 +
 .../api/windowing/assigners/MergingWindowAssigner.java|  1 +
 .../api/windowing/assigners/ProcessingTimeSessionWindows.java |  1 +
 .../api/windowing/assigners/SlidingProcessingTimeWindows.java |  2 ++
 .../windowing/assigners/TumblingProcessingTimeWindows.java|  1 +
 .../streaming/api/windowing/assigners/WindowAssigner.java |  1 +
 8 files changed, 7 insertions(+), 17 deletions(-)

diff --git a/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8 b/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
index 7510d7d0190..653c70bef5c 100644
--- a/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
+++ b/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
@@ -255,16 +255,6 @@ 
org.apache.flink.streaming.api.operators.TwoInputStreamOperator.processElement1(
 
org.apache.flink.streaming.api.operators.TwoInputStreamOperator.processElement2(org.apache.flink.streaming.runtime.streamrecord.StreamRecord):
 Argument leaf type 
org.apache.flink.streaming.runtime.streamrecord.StreamRecord does not satisfy: 
reside outside of package 'org.apache.flink..' or reside in any package 
['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or 
annotated with @Deprecated
 
org.apache.flink.streaming.api.operators.TwoInputStreamOperator.processWatermarkStatus1(org.apache.flink.streaming.runtime.watermarkstatus.WatermarkStatus):
 Argument leaf type 
org.apache.flink.streaming.runtime.watermarkstatus.WatermarkStatus does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
 
org.apache.flink.streaming.api.operators.TwoInputStreamOperator.processWatermarkStatus2(org.apache.flink.streaming.runtime.watermarkstatus.WatermarkStatus):
 Argument leaf type 
org.apache.flink.streaming.runtime.watermarkstatus.WatermarkStatus does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
-org.apache.flink.streaming.api.windowing.assigners.DynamicEventTimeSessionWindows.assignWindows(java.lang.Object,
 long, 
org.apache.flink.streaming.api.windowing.assigners.WindowAssigner$WindowAssignerContext):
 Argument leaf type 
org.apache.flink.streaming.api.windowing.assigners.WindowAssigner$WindowAssignerContext
 does not satisfy: reside outside of package 'org.apache.flink..' or reside in 
any package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotat [...]
-org.apache.flink.streaming.api.windowing.assigners.DynamicEventTimeSessionWindows.mergeWindows(java.util.Collection,
 
org.apache.flink.streaming.api.windowing.assigners.MergingWindowAssigner$MergeCallback):
 Argument leaf type 
org.apache.flink.streaming.api.windowing.assigners.MergingWindowAssigner$MergeCallback
 does not satisfy: reside outside of package 'org.apache.flink..' or reside in 
any package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated wi [...]
-org.apache.flink.streaming.api.windowing.assigners.DynamicProcessingTimeSessionWindows.assignWindows(java.lang.Object,
 long, 
org.apache.flink.streaming.api.windowing.assigners.WindowAssigner$WindowAssignerContext):
 Argument leaf type 
org.apache.flink.streaming.api.windowing.assigners.WindowAssigner$WindowAssignerContext
 does not satisfy: reside outside of package 'org.apache.flink..' or reside in 
any package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or an [...]
-org.apache.flink.streaming.api.windowing.assigners.DynamicProcessingTimeSessionWindows.mergeWindows(java.util.Collection,
 
org.apache.flink.streaming.api.windowing.assigners.MergingWindowAssigner$MergeCallback):
 Argument leaf type 
org.apache.flink.streaming.api.windowing.assigners.MergingWindowAssigner$MergeCallback
 does not satisfy: reside outside of package 'org.apache.flink..' or reside in 
any package ['..shaded..'] or annotated with 

(flink) branch master updated (639deeca337 -> ed79a1fc312)

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 639deeca337 [FLINK-34012][python] Fix mypy error (#24043)
 new 6f0d07633a5 [FLINK-32570][streaming] Deprecates org.apache.flink.streaming.api.windowing.time.Time-related APIs in favor of Duration
 new c943ab49c4a [FLINK-32570][core] Introduces Duration replacements for org.apache.flink.api.common.time.Time-related APIs and deprecates the corresponding methods/classes/constructors
 new ce5e03750a7 [hotfix][api] Adds missing @PublicEvolving to WindowAssigner-related classes to remove ArchUnit exclusions
 new 5bbf973df19 [hotfix][api] Adds @PublicEvolving to RestartStrategies' inner classes to remove ArchUnit exclusions
 new ed79a1fc312 [hotfix][api] Adds @PublicEvolving to StateTtlConfig inner classes to remove ArchUnit exclusions

The 5 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../5b9eed8a-5fb6-4373-98ac-3be2a71941b8   |  25 
 .../7602816f-5c01-4b7a-9e3e-235dfedec245   |  19 ---
 .../common/restartstrategy/RestartStrategies.java  | 152 +++--
 .../flink/api/common/state/StateTtlConfig.java |  44 +-
 .../org/apache/flink/api/common/time/Time.java |  14 ++
 .../main/java/org/apache/flink/util/TimeUtils.java |   2 +
 .../flink/streaming/examples/join/WindowJoin.java  |   3 +-
 .../examples/sideoutput/SideOutputExample.java |   3 +-
 .../examples/socket/SocketWindowWordCount.java |   5 +-
 .../GroupedProcessingTimeWindowExample.java|   5 +-
 .../examples/windowing/SessionWindowing.java   |   3 +-
 .../examples/windowing/TopSpeedWindowing.java  |   4 +-
 .../apache/flink/cep/scala/pattern/Pattern.scala   |  20 ++-
 .../flink/cep/scala/pattern/PatternTest.scala  |   6 +-
 .../cep/functions/TimedOutPartialMatchHandler.java |   6 +-
 .../apache/flink/cep/nfa/compiler/NFACompiler.java |  43 +++---
 .../java/org/apache/flink/cep/pattern/Pattern.java | 131 --
 .../org/apache/flink/cep/pattern/Quantifier.java   |  29 +++-
 .../apache/flink/cep/operator/CEPOperatorTest.java |   8 +-
 .../operator/CepProcessFunctionContextTest.java|   4 +-
 .../RestartBackoffTimeStrategyFactoryLoader.java   |  12 +-
 ...estartBackoffTimeStrategyFactoryLoaderTest.java |  10 +-
 .../api/datastream/AllWindowedStream.java  |  21 ++-
 .../streaming/api/datastream/CoGroupedStreams.java |  63 -
 .../streaming/api/datastream/JoinedStreams.java|  66 -
 .../streaming/api/datastream/KeyedStream.java  |  31 -
 .../streaming/api/datastream/WindowedStream.java   |  19 ++-
 .../environment/StreamExecutionEnvironment.java|  15 +-
 .../BoundedOutOfOrdernessTimestampExtractor.java   |  25 +++-
 .../assigners/EventTimeSessionWindows.java |  19 ++-
 .../windowing/assigners/MergingWindowAssigner.java |   1 +
 .../assigners/ProcessingTimeSessionWindows.java|  19 ++-
 .../assigners/SlidingEventTimeWindows.java |  47 ++-
 .../assigners/SlidingProcessingTimeWindows.java|  46 ++-
 .../assigners/TumblingEventTimeWindows.java|  63 -
 .../assigners/TumblingProcessingTimeWindows.java   |  66 -
 .../api/windowing/assigners/WindowAssigner.java|   1 +
 .../api/windowing/evictors/TimeEvictor.java|  30 +++-
 .../flink/streaming/api/windowing/time/Time.java   |  20 +++
 .../triggers/ContinuousEventTimeTrigger.java   |  16 ++-
 .../triggers/ContinuousProcessingTimeTrigger.java  |  16 ++-
 .../operators/windowing/WindowOperatorBuilder.java |   9 +-
 .../flink/streaming/api/RestartStrategyTest.java   |   4 +-
 .../api/datastream/CoGroupedStreamsTest.java   |  14 +-
 .../api/datastream/JoinedStreamsTest.java  |  19 +--
 .../planner/match/PatternTranslatorTestBase.scala  |   6 +-
 46 files changed, 974 insertions(+), 210 deletions(-)



(flink) 04/05: [hotfix][api] Adds @PublicEvolving to RestartStrategies' inner classes to remove ArchUnit exclusions

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 5bbf973df19783d1dd9fde616cba1638b050b11d
Author: Matthias Pohl 
AuthorDate: Thu Jul 13 14:36:22 2023 +0200

[hotfix][api] Adds @PublicEvolving to RestartStrategies' inner classes to remove ArchUnit exclusions

Signed-off-by: Matthias Pohl 
---
 .../archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8 | 9 -
 .../archunit-violations/7602816f-5c01-4b7a-9e3e-235dfedec245 | 6 --
 .../flink/api/common/restartstrategy/RestartStrategies.java  | 6 ++
 3 files changed, 6 insertions(+), 15 deletions(-)

diff --git a/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8 b/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
index 653c70bef5c..6079bb488a7 100644
--- a/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
+++ b/flink-architecture-tests/flink-architecture-tests-production/archunit-violations/5b9eed8a-5fb6-4373-98ac-3be2a71941b8
@@ -4,13 +4,6 @@ 
org.apache.flink.api.common.operators.Operator.getMinResources(): Returned leaf
 org.apache.flink.api.common.operators.Operator.getPreferredResources(): 
Returned leaf type org.apache.flink.api.common.operators.ResourceSpec does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any 
package ['..shaded..'] or annotated with @Public or annotated with 
@PublicEvolving or annotated with @Deprecated
 
org.apache.flink.api.common.operators.Operator.setResources(org.apache.flink.api.common.operators.ResourceSpec,
 org.apache.flink.api.common.operators.ResourceSpec): Argument leaf type 
org.apache.flink.api.common.operators.ResourceSpec does not satisfy: reside 
outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] 
or annotated with @Public or annotated with @PublicEvolving or annotated with 
@Deprecated
 
org.apache.flink.api.common.operators.SlotSharingGroup.newBuilder(java.lang.String):
 Returned leaf type 
org.apache.flink.api.common.operators.SlotSharingGroup$Builder does not 
satisfy: reside outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.restartstrategy.RestartStrategies.exponentialDelayRestart(org.apache.flink.api.common.time.Time, org.apache.flink.api.common.time.Time, double, org.apache.flink.api.common.time.Time, double): Returned leaf type org.apache.flink.api.common.restartstrategy.RestartStrategies$ExponentialDelayRestartStrategyConfiguration does not satisfy: reside outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] or annotated with @Public or annotated wi [...]
-org.apache.flink.api.common.restartstrategy.RestartStrategies.failureRateRestart(int, org.apache.flink.api.common.time.Time, org.apache.flink.api.common.time.Time): Returned leaf type org.apache.flink.api.common.restartstrategy.RestartStrategies$FailureRateRestartStrategyConfiguration does not satisfy: reside outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.restartstrategy.RestartStrategies.fallBackRestart(): Returned leaf type org.apache.flink.api.common.restartstrategy.RestartStrategies$RestartStrategyConfiguration does not satisfy: reside outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.restartstrategy.RestartStrategies.fixedDelayRestart(int, long): Returned leaf type org.apache.flink.api.common.restartstrategy.RestartStrategies$RestartStrategyConfiguration does not satisfy: reside outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.restartstrategy.RestartStrategies.fixedDelayRestart(int, org.apache.flink.api.common.time.Time): Returned leaf type org.apache.flink.api.common.restartstrategy.RestartStrategies$RestartStrategyConfiguration does not satisfy: reside outside of package 'org.apache.flink..' or reside in any package ['..shaded..'] or annotated with @Public or annotated with @PublicEvolving or annotated with @Deprecated
-org.apache.flink.api.common.restartstrategy.RestartStrategies.fromConfiguration(org.apache.flink.configuration.ReadableConfig): Returned leaf type org.apache.flink.api.common.restartstrategy.RestartStrategies$RestartStrategyConfiguration does not satisfy: reside
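The rule these violations cite can be sketched in plain Java with reflection. This is a minimal sketch, not Flink's actual ArchUnit rule; the `@Public`/`@PublicEvolving` annotations below are local stand-ins for the real ones in flink-annotations:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Local stand-ins for org.apache.flink.annotation.Public / @PublicEvolving,
// which are not on the classpath in this sketch.
@Retention(RetentionPolicy.RUNTIME)
@interface Public {}

@Retention(RetentionPolicy.RUNTIME)
@interface PublicEvolving {}

public class ApiVisibilityCheck {

    // Mirrors the rule quoted in the report: a returned leaf type passes if it
    // resides outside 'org.apache.flink..', resides in a '..shaded..' package,
    // or is annotated with @Public, @PublicEvolving or @Deprecated.
    static boolean satisfiesRule(Class<?> leafType) {
        Package p = leafType.getPackage();
        String pkg = (p == null) ? "" : p.getName();
        return !pkg.startsWith("org.apache.flink")
                || pkg.contains(".shaded.")
                || leafType.isAnnotationPresent(Public.class)
                || leafType.isAnnotationPresent(PublicEvolving.class)
                || leafType.isAnnotationPresent(Deprecated.class);
    }

    public static void main(String[] args) {
        // java.time.Duration resides outside org.apache.flink, so it passes.
        System.out.println(satisfiesRule(java.time.Duration.class)); // true
    }
}
```

Each listed `RestartStrategies` factory fails because its returned configuration type lives inside `org.apache.flink..` without carrying any of the required annotations.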

(flink) 01/05: [FLINK-32570][streaming] Deprecates org.apache.flink.streaming.api.windowing.time.Time-related APIs in favor of Duration

2024-01-08 Thread mapohl
This is an automated email from the ASF dual-hosted git repository.

mapohl pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 6f0d07633a5c8e6511f3d16e04561cb277b65407
Author: Matthias Pohl 
AuthorDate: Mon Jul 10 19:11:57 2023 +0200

[FLINK-32570][streaming] Deprecates org.apache.flink.streaming.api.windowing.time.Time-related APIs in favor of Duration

Signed-off-by: Matthias Pohl 
---
 .../flink/streaming/examples/join/WindowJoin.java  |   3 +-
 .../examples/sideoutput/SideOutputExample.java |   3 +-
 .../examples/socket/SocketWindowWordCount.java |   5 +-
 .../GroupedProcessingTimeWindowExample.java|   5 +-
 .../examples/windowing/SessionWindowing.java   |   3 +-
 .../examples/windowing/TopSpeedWindowing.java  |   4 +-
 .../apache/flink/cep/scala/pattern/Pattern.scala   |  20 +++-
 .../flink/cep/scala/pattern/PatternTest.scala  |   6 +-
 .../cep/functions/TimedOutPartialMatchHandler.java |   6 +-
 .../apache/flink/cep/nfa/compiler/NFACompiler.java |  43 ---
 .../java/org/apache/flink/cep/pattern/Pattern.java | 131 +++--
 .../org/apache/flink/cep/pattern/Quantifier.java   |  29 -
 .../apache/flink/cep/operator/CEPOperatorTest.java |   8 +-
 .../operator/CepProcessFunctionContextTest.java|   4 +-
 .../api/datastream/AllWindowedStream.java  |  21 +++-
 .../streaming/api/datastream/CoGroupedStreams.java |  63 +-
 .../streaming/api/datastream/JoinedStreams.java|  66 +--
 .../streaming/api/datastream/KeyedStream.java  |  31 +++--
 .../streaming/api/datastream/WindowedStream.java   |  19 ++-
 .../environment/StreamExecutionEnvironment.java|  10 +-
 .../BoundedOutOfOrdernessTimestampExtractor.java   |  25 ++--
 .../assigners/EventTimeSessionWindows.java |  18 ++-
 .../assigners/ProcessingTimeSessionWindows.java|  18 ++-
 .../assigners/SlidingEventTimeWindows.java |  47 +++-
 .../assigners/SlidingProcessingTimeWindows.java|  44 ++-
 .../assigners/TumblingEventTimeWindows.java|  63 +-
 .../assigners/TumblingProcessingTimeWindows.java   |  65 +-
 .../api/windowing/evictors/TimeEvictor.java|  30 -
 .../flink/streaming/api/windowing/time/Time.java   |  20 
 .../triggers/ContinuousEventTimeTrigger.java   |  16 ++-
 .../triggers/ContinuousProcessingTimeTrigger.java  |  16 ++-
 .../operators/windowing/WindowOperatorBuilder.java |   9 +-
 .../api/datastream/CoGroupedStreamsTest.java   |  14 +--
 .../api/datastream/JoinedStreamsTest.java  |  19 +--
 .../planner/match/PatternTranslatorTestBase.scala  |   6 +-
 35 files changed, 754 insertions(+), 136 deletions(-)

diff --git a/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/join/WindowJoin.java b/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/join/WindowJoin.java
index 3b6d6c2bd89..abd44392a60 100644
--- a/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/join/WindowJoin.java
+++ b/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/join/WindowJoin.java
@@ -36,7 +36,6 @@ import org.apache.flink.streaming.api.datastream.DataStream;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.streaming.api.functions.sink.filesystem.rollingpolicies.DefaultRollingPolicy;
 import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows;
-import org.apache.flink.streaming.api.windowing.time.Time;
 
 import java.time.Duration;
 
@@ -122,7 +121,7 @@ public class WindowJoin {
 return grades.join(salaries)
 .where(new NameKeySelector())
 .equalTo(new NameKeySelector())
- .window(TumblingEventTimeWindows.of(Time.milliseconds(windowSize)))
+ .window(TumblingEventTimeWindows.of(Duration.ofMillis(windowSize)))
 .apply(
 new JoinFunction<
 Tuple2,
diff --git a/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/sideoutput/SideOutputExample.java b/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/sideoutput/SideOutputExample.java
index 13dc391e817..6e33f5d9ffc 100644
--- a/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/sideoutput/SideOutputExample.java
+++ b/flink-examples/flink-examples-streaming/src/main/java/org/apache/flink/streaming/examples/sideoutput/SideOutputExample.java
@@ -39,7 +39,6 @@ import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.streaming.api.functions.ProcessFunction;
 import org.apache.flink.streaming.api.functions.sink.filesystem.rollingpolicies.DefaultRollingPolicy;
 import 
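The migration pattern in the diffs above — replacing the deprecated org.apache.flink.streaming.api.windowing.time.Time with java.time.Duration — is a one-to-one mapping of factory methods. A minimal sketch using only java.time (the Flink calls are shown in comments, since they are not on the classpath here):

```java
import java.time.Duration;

public class DurationMigration {
    public static void main(String[] args) {
        long windowSize = 2000L;

        // Old style (deprecated by FLINK-32570):
        //   TumblingEventTimeWindows.of(Time.milliseconds(windowSize))
        // New style:
        //   TumblingEventTimeWindows.of(Duration.ofMillis(windowSize))
        Duration window = Duration.ofMillis(windowSize);

        // The common Time factories map directly onto java.time.Duration:
        Duration millis  = Duration.ofMillis(500);   // Time.milliseconds(500)
        Duration seconds = Duration.ofSeconds(10);   // Time.seconds(10)
        Duration minutes = Duration.ofMinutes(5);    // Time.minutes(5)

        System.out.println(window.toMillis()); // 2000
    }
}
```

Since java.time.Duration is a JDK type, the replacement also removes one Flink-specific import from user code, as the WindowJoin diff shows.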

(flink-connector-rabbitmq) branch v3.0 updated: [FLINK-34019] Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0. This closes #18

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-rabbitmq.git


The following commit(s) were added to refs/heads/v3.0 by this push:
 new 881c83b  [FLINK-34019] Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0. This closes #18
881c83b is described below

commit 881c83b5daf3a4e48a8027f85dff1f1f99dc6f81
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
AuthorDate: Mon Jan 8 11:05:28 2024 +0100

[FLINK-34019] Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0. This closes #18

* Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq

Bumps [com.rabbitmq:amqp-client](https://github.com/rabbitmq/rabbitmq-java-client) from 5.13.1 to 5.18.0.
- [Release notes](https://github.com/rabbitmq/rabbitmq-java-client/releases)
- [Commits](https://github.com/rabbitmq/rabbitmq-java-client/compare/v5.13.1...v5.18.0)

---
updated-dependencies:
- dependency-name: com.rabbitmq:amqp-client
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] 

* Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq

-

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
(cherry picked from commit 427bf4cd9fbf27d338a33250257489491952aeb8)
---
 flink-connector-rabbitmq/pom.xml| 2 +-
 flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-connector-rabbitmq/pom.xml b/flink-connector-rabbitmq/pom.xml
index b2f5dcf..a8fa213 100644
--- a/flink-connector-rabbitmq/pom.xml
+++ b/flink-connector-rabbitmq/pom.xml
@@ -36,7 +36,7 @@ under the License.
 


-   5.13.1
+   5.20.0

 

diff --git a/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE b/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE
index 9cdf920..e4f475f 100644
--- a/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE
+++ b/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE
@@ -6,4 +6,4 @@ The Apache Software Foundation (http://www.apache.org/).
 
 This project bundles the following dependencies under the Apache Software License 2.0 (http://www.apache.org/licenses/LICENSE-2.0.txt)
 
-- com.rabbitmq:amqp-client:5.13.1
+- com.rabbitmq:amqp-client:5.20.0



(flink-connector-rabbitmq) branch dependabot/maven/flink-connector-rabbitmq/com.rabbitmq-amqp-client-5.18.0 deleted (was b565135)

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch dependabot/maven/flink-connector-rabbitmq/com.rabbitmq-amqp-client-5.18.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-rabbitmq.git


 was b565135  Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(flink-connector-rabbitmq) branch main updated: [FLINK-34019] Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0. This closes #18

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-rabbitmq.git


The following commit(s) were added to refs/heads/main by this push:
 new 427bf4c  [FLINK-34019] Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0. This closes #18
427bf4c is described below

commit 427bf4cd9fbf27d338a33250257489491952aeb8
Author: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
AuthorDate: Mon Jan 8 11:05:28 2024 +0100

[FLINK-34019] Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0. This closes #18

* Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq

Bumps [com.rabbitmq:amqp-client](https://github.com/rabbitmq/rabbitmq-java-client) from 5.13.1 to 5.18.0.
- [Release notes](https://github.com/rabbitmq/rabbitmq-java-client/releases)
- [Commits](https://github.com/rabbitmq/rabbitmq-java-client/compare/v5.13.1...v5.18.0)

---
updated-dependencies:
- dependency-name: com.rabbitmq:amqp-client
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] 

* Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq

-

Signed-off-by: dependabot[bot] 
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Martijn Visser <2989614+martijnvis...@users.noreply.github.com>
---
 flink-connector-rabbitmq/pom.xml| 2 +-
 flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/flink-connector-rabbitmq/pom.xml b/flink-connector-rabbitmq/pom.xml
index b2f5dcf..a8fa213 100644
--- a/flink-connector-rabbitmq/pom.xml
+++ b/flink-connector-rabbitmq/pom.xml
@@ -36,7 +36,7 @@ under the License.
 


-   5.13.1
+   5.20.0

 

diff --git a/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE b/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE
index 9cdf920..e4f475f 100644
--- a/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE
+++ b/flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE
@@ -6,4 +6,4 @@ The Apache Software Foundation (http://www.apache.org/).
 
 This project bundles the following dependencies under the Apache Software License 2.0 (http://www.apache.org/licenses/LICENSE-2.0.txt)
 
-- com.rabbitmq:amqp-client:5.13.1
+- com.rabbitmq:amqp-client:5.20.0



(flink-connector-rabbitmq) branch dependabot/maven/flink-connector-rabbitmq/com.rabbitmq-amqp-client-5.18.0 updated (727fe27 -> b565135)

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch dependabot/maven/flink-connector-rabbitmq/com.rabbitmq-amqp-client-5.18.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-rabbitmq.git


from 727fe27  Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq
 add b565135  Bump com.rabbitmq:amqp-client in /flink-connector-rabbitmq

No new revisions were added by this update.

Summary of changes:
 flink-connector-rabbitmq/pom.xml| 2 +-
 flink-sql-connector-rabbitmq/src/main/resources/META-INF/NOTICE | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)



(flink-connector-rabbitmq) branch v3.0 updated: [FLINK-26907] RMQSourceITCase failed on Azure due to container startup failed. This closes #20

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch v3.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-rabbitmq.git


The following commit(s) were added to refs/heads/v3.0 by this push:
 new 8672727  [FLINK-26907] RMQSourceITCase failed on Azure due to container startup failed. This closes #20
8672727 is described below

commit 8672727f3643f19c951afcf20369ccf06aea9b56
Author: yuxiang <384669...@qq.com>
AuthorDate: Mon Jan 8 17:00:24 2024 +0800

[FLINK-26907] RMQSourceITCase failed on Azure due to container startup failed. This closes #20

Co-authored-by: yu <13485876233>
(cherry picked from commit 7203c951987412c669e9b7ca75d5a8584b2b8c80)
---
 .../apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java| 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java b/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java
index a245d54..db3a32e 100644
--- a/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java
+++ b/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java
@@ -88,7 +88,8 @@ public class RMQSourceITCase {
 public static final RabbitMQContainer RMQ_CONTAINER =
 new RabbitMQContainer(DockerImageName.parse(DockerImageVersions.RABBITMQ))
 .withExposedPorts(RABBITMQ_PORT)
-.withLogConsumer(LOG_CONSUMER);
+.withLogConsumer(LOG_CONSUMER)
+.withStartupAttempts(3);
 
 @Before
 public void setUp() throws Exception {



(flink-connector-rabbitmq) branch main updated: [FLINK-26907] RMQSourceITCase failed on Azure due to container startup failed. This closes #20

2024-01-08 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-rabbitmq.git


The following commit(s) were added to refs/heads/main by this push:
 new 7203c95  [FLINK-26907] RMQSourceITCase failed on Azure due to container startup failed. This closes #20
7203c95 is described below

commit 7203c951987412c669e9b7ca75d5a8584b2b8c80
Author: yuxiang <384669...@qq.com>
AuthorDate: Mon Jan 8 17:00:24 2024 +0800

[FLINK-26907] RMQSourceITCase failed on Azure due to container startup failed. This closes #20

Co-authored-by: yu <13485876233>
---
 .../apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java| 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java b/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java
index a245d54..db3a32e 100644
--- a/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java
+++ b/flink-connector-rabbitmq/src/test/java/org/apache/flink/streaming/connectors/rabbitmq/RMQSourceITCase.java
@@ -88,7 +88,8 @@ public class RMQSourceITCase {
 public static final RabbitMQContainer RMQ_CONTAINER =
 new RabbitMQContainer(DockerImageName.parse(DockerImageVersions.RABBITMQ))
 .withExposedPorts(RABBITMQ_PORT)
-.withLogConsumer(LOG_CONSUMER);
+.withLogConsumer(LOG_CONSUMER)
+.withStartupAttempts(3);
 
 @Before
 public void setUp() throws Exception {
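The fix above adds withStartupAttempts(3), which tells Testcontainers to retry container startup before failing the test, masking transient failures like the Azure ones in FLINK-26907. The retry idea can be sketched in plain Java (a simplified stand-in, not the Testcontainers implementation; the flaky startup below is hypothetical):

```java
import java.util.concurrent.Callable;

public class StartupRetry {

    // Runs the startup action up to `attempts` times, rethrowing the last
    // failure only if every attempt fails -- the idea behind
    // Testcontainers' withStartupAttempts(3).
    static <T> T withStartupAttempts(Callable<T> startup, int attempts) throws Exception {
        Exception last = null;
        for (int i = 1; i <= attempts; i++) {
            try {
                return startup.call();
            } catch (Exception e) {
                last = e; // e.g. a transient port clash or slow image pull
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical flaky startup: fails twice, then succeeds.
        int[] calls = {0};
        String result = withStartupAttempts(() -> {
            if (++calls[0] < 3) {
                throw new IllegalStateException("container startup failed");
            }
            return "started";
        }, 3);
        System.out.println(result + " after " + calls[0] + " attempts"); // started after 3 attempts
    }
}
```

With a single attempt (the old behavior), any transient startup failure fails the whole test run, which is what the Azure CI builds were hitting.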