[flink] branch release-1.15 updated: Update japicmp configuration for 1.15.0
This is an automated email from the ASF dual-hosted git repository. gaoyunhaii pushed a commit to branch release-1.15 in repository https://gitbox.apache.org/repos/asf/flink.git The following commit(s) were added to refs/heads/release-1.15 by this push: new 616cd77a5ab Update japicmp configuration for 1.15.0 616cd77a5ab is described below commit 616cd77a5ab90ac858cde08f10e763473f259e55 Author: Yun Gao AuthorDate: Thu May 5 14:49:04 2022 +0800 Update japicmp configuration for 1.15.0 --- pom.xml | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/pom.xml b/pom.xml index 711c631e097..edd9a3900d3 100644 --- a/pom.xml +++ b/pom.xml @@ -155,7 +155,7 @@ under the License. For Hadoop 2.7, the minor Hadoop version supported for flink-shaded-hadoop-2-uber is 2.7.5 --> 2.7.5 - 1.14.0 + 1.15.0 tools/japicmp-output 2.4.2 @@ -2129,11 +2129,10 @@ under the License. @org.apache.flink.annotation.Public - + @org.apache.flink.annotation.PublicEvolving @org.apache.flink.annotation.Experimental - @org.apache.flink.annotation.PublicEvolving @org.apache.flink.annotation.Internal org.apache.flink.streaming.api.datastream.DataStream#DataStream(org.apache.flink.streaming.api.environment.StreamExecutionEnvironment,org.apache.flink.streaming.api.transformations.StreamTransformation) org.apache.flink.streaming.api.environment.LegacyLocalStreamEnvironment
[flink] branch master updated: Update japicmp configuration for 1.15.0
This is an automated email from the ASF dual-hosted git repository. gaoyunhaii pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink.git The following commit(s) were added to refs/heads/master by this push: new 763289a71e7 Update japicmp configuration for 1.15.0 763289a71e7 is described below commit 763289a71e7e23b93c766a10f1e0c93b3ec4bf44 Author: Yun Gao AuthorDate: Thu May 5 14:47:13 2022 +0800 Update japicmp configuration for 1.15.0 --- pom.xml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/pom.xml b/pom.xml index 3de8112bb15..ee5eaf09794 100644 --- a/pom.xml +++ b/pom.xml @@ -157,7 +157,7 @@ under the License. For Hadoop 2.7, the minor Hadoop version supported for flink-shaded-hadoop-2-uber is 2.7.5 --> 2.7.5 - 1.14.0 + 1.15.0 tools/japicmp-output 2.13.0 3.4.3
[flink-web] 01/02: Announcement blogpost for the 1.15 release
This is an automated email from the ASF dual-hosted git repository. gaoyunhaii pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/flink-web.git commit cc1df7a781f737ee920863fe5e636b4587ef63d3 Author: Joe Moser AuthorDate: Mon Apr 11 12:50:32 2022 +0200 Announcement blogpost for the 1.15 release --- _config.yml| 27 ++- _posts/2022-05-05-1.15-announcement.md | 427 + q/gradle-quickstart.sh | 2 +- q/quickstart-SNAPSHOT.sh | 2 +- q/quickstart-scala-SNAPSHOT.sh | 2 +- q/quickstart-scala.sh | 2 +- q/quickstart.sh| 2 +- q/sbt-quickstart.sh| 2 +- 8 files changed, 456 insertions(+), 10 deletions(-) diff --git a/_config.yml b/_config.yml index b16fe6879..37f22f4eb 100644 --- a/_config.yml +++ b/_config.yml @@ -9,8 +9,8 @@ url: https://flink.apache.org DOCS_BASE_URL: https://nightlies.apache.org/flink/ -FLINK_VERSION_STABLE: 1.14.4 -FLINK_VERSION_STABLE_SHORT: "1.14" +FLINK_VERSION_STABLE: 1.15.0 +FLINK_VERSION_STABLE_SHORT: "1.15" FLINK_ISSUES_URL: https://issues.apache.org/jira/browse/FLINK FLINK_GITHUB_URL: https://github.com/apache/flink @@ -73,6 +73,21 @@ FLINK_TABLE_STORE_GITHUB_REPO_NAME: flink-table-store # md1_url: https://repo.maven.apache.org/maven2/org/apache/flink/flink-metrics-prometheus_2.12/1.7.1/flink-metrics-prometheus_2.12-1.7.1.jar.sha1 flink_releases: + - version_short: "1.15" +binary_release: + name: "Apache Flink 1.15.0" + scala_212: +id: "1150-download_212" +url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz"; +asc_url: "https://downloads.apache.org/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz.asc"; +sha512_url: "https://downloads.apache.org/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz.sha512"; +source_release: + name: "Apache Flink 1.15.0" + id: "1150-download-source" + url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.15.0/flink-1.15.0-src.tgz"; + asc_url: "https://downloads.apache.org/flink/flink-1.15.0/flink-1.15.0-src.tgz.asc"; + sha512_url: 
"https://downloads.apache.org/flink/flink-1.15.0/flink-1.15.0-src.tgz.sha512"; +release_notes_url: "https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15"; - version_short: "1.14" binary_release: @@ -233,7 +248,11 @@ component_releases: release_archive: flink: - - version_short: "1.14" + - version_short: "1.15" +version_long: 1.15.0 +release_date: 2022-05-05 + - +version_short: "1.14" version_long: 1.14.4 release_date: 2022-03-02 - @@ -709,4 +728,4 @@ timezone: CET # News Posts paginate: 10 -paginate_path: "blog/page:num" +paginate_path: "blog/page:num" \ No newline at end of file diff --git a/_posts/2022-05-05-1.15-announcement.md b/_posts/2022-05-05-1.15-announcement.md new file mode 100644 index 0..e0925bd1b --- /dev/null +++ b/_posts/2022-05-05-1.15-announcement.md @@ -0,0 +1,427 @@ +--- +layout: post +title: "Announcing the Release of Apache Flink 1.15" +subtitle: "" +date: 2022-05-05T08:00:00.000Z +categories: news +authors: +- joemoe: + name: "Joe Moser" + twitter: "JoemoeAT" +- yungao: + name: "Yun Gao" + twitter: "YunGao16" + +--- + +Thanks to our well-organized and open community, Apache Flink continues +[to grow](https://www.apache.org/foundation/docs/FY2021AnnualReport.pdf) as a +technology and remains one of the most active projects in +the Apache community. With the release of Flink 1.15, we are proud to announce a number of +exciting changes. + +One of the main concepts that makes Apache Flink stand out is the unification of +batch (aka bounded) and stream (aka unbounded) data processing, which helps reduce the complexity of development. A lot of +effort went into this unification in the previous releases, and you can expect more efforts in this direction. + +Apache Flink is not only growing when it comes to contributions and users, but +also beyond its original use cases. We are seeing a trend towards more business/analytics +use cases implemented in low-/no-code. 
Flink SQL is the feature in the Flink ecosystem +that enables such use cases, which is why its popularity continues to grow. + +Apache Flink is an essential building block in data pipelines/architectures and +is used with many other technologies in order to drive all sorts of use cases. While new ideas/products +may appear in this domain, existing technologies continue to establish themselves as standards for solving +mission-critical problems. Knowing that we have such a wide reach and play a role in the success of many +projects, it is important that the experience of +integrating Apache Flink with the cloud infrastructures and
[flink-web] branch asf-site updated (05803f97b -> d633c03d4)
This is an automated email from the ASF dual-hosted git repository. gaoyunhaii pushed a change to branch asf-site in repository https://gitbox.apache.org/repos/asf/flink-web.git from 05803f97b Rebuild website new cc1df7a78 Announcement blogpost for the 1.15 release new d633c03d4 Rebuild website The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference. Summary of changes: _config.yml| 27 +- _posts/2022-05-05-1.15-announcement.md | 427 + content/2019/05/03/pulsar-flink.html | 2 +- content/2019/05/14/temporal-tables.html| 2 +- content/2019/05/19/state-ttl.html | 2 +- content/2019/06/05/flink-network-stack.html| 2 +- content/2019/06/26/broadcast-state.html| 2 +- content/2019/07/23/flink-network-stack-2.html | 2 +- content/2020/04/09/pyflink-udf-support-flink.html | 2 +- content/2020/07/23/catalogs.html | 2 +- ...ql-demo-building-e2e-streaming-application.html | 2 +- .../08/04/pyflink-pandas-udf-support-flink.html| 2 +- content/2020/08/19/statefun.html | 2 +- .../flink-1.11-memory-management-improvements.html | 2 +- ...om-aligned-to-unaligned-checkpoints-part-1.html | 2 +- content/2020/12/15/pipelined-region-sheduling.html | 2 +- content/2021/01/07/pulsar-flink-connector-270.html | 2 +- content/2021/01/18/rocksdb.html| 2 +- content/2021/02/10/native-k8s-with-ha.html | 2 +- content/2021/03/11/batch-execution-mode.html | 2 +- content/2021/05/06/reactive-mode.html | 2 +- content/2021/07/07/backpressure.html | 2 +- .../2021/09/07/connector-table-sql-api-part1.html | 2 +- .../2021/09/07/connector-table-sql-api-part2.html | 2 +- content/2021/10/26/sort-shuffle-part1.html | 2 +- content/2021/10/26/sort-shuffle-part2.html | 2 +- content/2021/11/03/flink-backward.html | 2 +- content/2021/12/10/log4j-cve.html | 2 +- .../2022/01/04/scheduler-performance-part-one.html | 2 +- 
.../2022/01/04/scheduler-performance-part-two.html | 2 +- content/2022/01/20/pravega-connector-101.html | 2 +- content/2022/02/22/scala-free.html | 2 +- content/blog/feed.xml | 544 - content/blog/index.html| 44 +- content/blog/page10/index.html | 38 +- content/blog/page11/index.html | 40 +- content/blog/page12/index.html | 42 +- content/blog/page13/index.html | 42 +- content/blog/page14/index.html | 42 +- content/blog/page15/index.html | 42 +- content/blog/page16/index.html | 41 +- content/blog/page17/index.html | 44 +- content/blog/page18/index.html | 30 +- content/blog/page2/index.html | 40 +- content/blog/page3/index.html | 40 +- content/blog/page4/index.html | 38 +- content/blog/page5/index.html | 38 +- content/blog/page6/index.html | 40 +- content/blog/page7/index.html | 40 +- content/blog/page8/index.html | 38 +- content/blog/page9/index.html | 38 +- .../blog/release_1.0.0-changelog_known_issues.html | 2 +- content/blog/release_1.1.0-changelog.html | 2 +- content/blog/release_1.2.0-changelog.html | 2 +- content/blog/release_1.3.0-changelog.html | 2 +- content/community.html | 2 +- .../code-style-and-quality-common.html | 2 +- .../code-style-and-quality-components.html | 2 +- .../code-style-and-quality-formatting.html | 2 +- .../contributing/code-style-and-quality-java.html | 2 +- .../code-style-and-quality-preamble.html | 2 +- .../code-style-and-quality-pull-requests.html | 2 +- .../contributing/code-style-and-quality-scala.html | 2 +- content/contributing/contribute-code.html | 2 +- content/contributing/contribute-documentation.html | 2 +- content/contributing/docs-style.html | 2 +- content/contributing/how-to-contribute.html| 2 +- content/contributing/improve-website.html | 2 +- content/contributing/reviewing-prs.html| 2 +- content/documentation.html | 2 +- content/downloads.html | 47 +- content/ecosystem.html | 2 +- .../apache-bea
[flink] branch release-1.15 updated: release notes for the 1.15 release
This is an automated email from the ASF dual-hosted git repository. gaoyunhaii pushed a commit to branch release-1.15 in repository https://gitbox.apache.org/repos/asf/flink.git The following commit(s) were added to refs/heads/release-1.15 by this push: new 23eafcd6ac6 release notes for the 1.15 release 23eafcd6ac6 is described below commit 23eafcd6ac6c8461a3e1b3dd197e059d50b93ae9 Author: Joe Moser AuthorDate: Tue Apr 26 11:47:16 2022 +0200 release notes for the 1.15 release --- docs/content/_index.md | 2 +- docs/content/release-notes/flink-1.15.md | 589 +++ 2 files changed, 590 insertions(+), 1 deletion(-) diff --git a/docs/content/_index.md b/docs/content/_index.md index 038d8048acc..2af8e6555ea 100644 --- a/docs/content/_index.md +++ b/docs/content/_index.md @@ -85,7 +85,7 @@ Release notes cover important changes between Flink versions. Please read them c For some reason Hugo will only allow linking to the release notes if there is a leading '/' and file extension. --> -See the release notes for [Flink 1.14]({{< ref "/release-notes/flink-1.14.md" >}}), [Flink 1.13]({{< ref "/release-notes/flink-1.13.md" >}}), [Flink 1.12]({{< ref "/release-notes/flink-1.12.md" >}}), [Flink 1.11]({{< ref "/release-notes/flink-1.11.md" >}}), [Flink 1.10]({{< ref "/release-notes/flink-1.10.md" >}}), [Flink 1.9]({{< ref "/release-notes/flink-1.9.md" >}}), [Flink 1.8]({{< ref "/release-notes/flink-1.8.md" >}}), or [Flink 1.7]({{< ref "/release-notes/flink-1.7.md" >}}). +See the release notes for [Flink 1.15]({{< ref "/release-notes/flink-1.15.md" >}}), [Flink 1.14]({{< ref "/release-notes/flink-1.14.md" >}}), [Flink 1.13]({{< ref "/release-notes/flink-1.13.md" >}}), [Flink 1.12]({{< ref "/release-notes/flink-1.12.md" >}}), [Flink 1.11]({{< ref "/release-notes/flink-1.11.md" >}}), [Flink 1.10]({{< ref "/release-notes/flink-1.10.md" >}}), [Flink 1.9]({{< ref "/release-notes/flink-1.9.md" >}}), [Flink 1.8]({{< ref "/release-notes/flink-1.8.md" >}}), or [Fl [...] 
{{< /columns >}} diff --git a/docs/content/release-notes/flink-1.15.md b/docs/content/release-notes/flink-1.15.md new file mode 100644 index 000..d9669cdff67 --- /dev/null +++ b/docs/content/release-notes/flink-1.15.md @@ -0,0 +1,589 @@ +--- +title: "Release Notes - Flink 1.15" +--- + + +# Release notes - Flink 1.15 + +These release notes discuss important aspects, such as configuration, behavior, +or dependencies, that changed between Flink 1.14 and Flink 1.15. Please read these +notes carefully if you are planning to upgrade your Flink version to 1.15. + +## Summary of changed dependency names + +There are several changes in Flink 1.15 that require updating dependency names when +upgrading from earlier versions, mainly resulting from the effort to opt Scala dependencies +out of non-Scala modules and to reorganize the table modules. A quick checklist of the dependency changes +is as follows: + +* Any dependency on one of the following modules needs to be updated to no longer include a suffix: + +``` +flink-cep +flink-clients +flink-connector-elasticsearch-base +flink-connector-elasticsearch6 +flink-connector-elasticsearch7 +flink-connector-gcp-pubsub +flink-connector-hbase-1.4 +flink-connector-hbase-2.2 +flink-connector-hbase-base +flink-connector-jdbc +flink-connector-kafka +flink-connector-kinesis +flink-connector-nifi +flink-connector-pulsar +flink-connector-rabbitmq +flink-container +flink-dstl-dfs +flink-gelly +flink-hadoop-bulk +flink-kubernetes +flink-runtime-web +flink-sql-connector-elasticsearch6 +flink-sql-connector-elasticsearch7 +flink-sql-connector-hbase-1.4 +flink-sql-connector-hbase-2.2 +flink-sql-connector-kafka +flink-sql-connector-kinesis +flink-sql-connector-rabbitmq +flink-state-processor-api +flink-statebackend-rocksdb +flink-streaming-java +flink-test-utils +flink-yarn +flink-table-api-java-bridge +flink-table-runtime +flink-sql-client +flink-orc +flink-orc-nohive +flink-parquet +``` +* For Table / SQL users, the new module 
`flink-table-planner-loader` replaces `flink-table-planner_2.12` + and avoids the need for a Scala suffix. For backwards compatibility, users can still + swap it with `flink-table-planner_2.12` located in `opt/`. + `flink-table-uber` has been split into `flink-table-api-java-uber`, + `flink-table-planner(-loader)`, and `flink-table-runtime`. Scala users need to explicitly add a dependency + on `flink-table-api-scala` or `flink-table-api-scala-bridge`. + +The details of the involved issues are listed as follows. + + Add support for opting-out of Scala + +# [FLINK-20845](https://issues.apache.org/jira/browse/FLINK-20845) + +The Java DataSet/-Stream APIs are now independent of Scala and no longer transitively depend on it. + +The implications are the
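For readers updating their builds, the dependency rename described in the release notes above can be sketched as a Maven POM change. This is an illustrative example only: the module (`flink-clients`) and versions are chosen as stand-ins, and the same pattern applies to every module in the list.

```xml
<!-- Before (Flink 1.14 and earlier): artifact name carries a Scala suffix -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients_2.12</artifactId>
  <version>1.14.4</version>
</dependency>

<!-- After (Flink 1.15): the module is Scala-independent, so the suffix is dropped -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients</artifactId>
  <version>1.15.0</version>
</dependency>
```

Modules not in the list (such as `flink-streaming-scala`) keep their Scala suffix.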
[flink] branch master updated: release notes for the 1.15 release
This is an automated email from the ASF dual-hosted git repository. gaoyunhaii pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink.git The following commit(s) were added to refs/heads/master by this push: new a66046adaf6 release notes for the 1.15 release a66046adaf6 is described below commit a66046adaf6a96cd09e55ac1a4cf1df63e502708 Author: Joe Moser AuthorDate: Tue Apr 26 11:47:16 2022 +0200 release notes for the 1.15 release --- docs/content/_index.md | 2 +- docs/content/release-notes/flink-1.15.md | 589 +++ 2 files changed, 590 insertions(+), 1 deletion(-) diff --git a/docs/content/_index.md b/docs/content/_index.md index 038d8048acc..2af8e6555ea 100644 --- a/docs/content/_index.md +++ b/docs/content/_index.md @@ -85,7 +85,7 @@ Release notes cover important changes between Flink versions. Please read them c For some reason Hugo will only allow linking to the release notes if there is a leading '/' and file extension. --> -See the release notes for [Flink 1.14]({{< ref "/release-notes/flink-1.14.md" >}}), [Flink 1.13]({{< ref "/release-notes/flink-1.13.md" >}}), [Flink 1.12]({{< ref "/release-notes/flink-1.12.md" >}}), [Flink 1.11]({{< ref "/release-notes/flink-1.11.md" >}}), [Flink 1.10]({{< ref "/release-notes/flink-1.10.md" >}}), [Flink 1.9]({{< ref "/release-notes/flink-1.9.md" >}}), [Flink 1.8]({{< ref "/release-notes/flink-1.8.md" >}}), or [Flink 1.7]({{< ref "/release-notes/flink-1.7.md" >}}). +See the release notes for [Flink 1.15]({{< ref "/release-notes/flink-1.15.md" >}}), [Flink 1.14]({{< ref "/release-notes/flink-1.14.md" >}}), [Flink 1.13]({{< ref "/release-notes/flink-1.13.md" >}}), [Flink 1.12]({{< ref "/release-notes/flink-1.12.md" >}}), [Flink 1.11]({{< ref "/release-notes/flink-1.11.md" >}}), [Flink 1.10]({{< ref "/release-notes/flink-1.10.md" >}}), [Flink 1.9]({{< ref "/release-notes/flink-1.9.md" >}}), [Flink 1.8]({{< ref "/release-notes/flink-1.8.md" >}}), or [Fl [...] 
{{< /columns >}} diff --git a/docs/content/release-notes/flink-1.15.md b/docs/content/release-notes/flink-1.15.md new file mode 100644 index 000..d9669cdff67 --- /dev/null +++ b/docs/content/release-notes/flink-1.15.md @@ -0,0 +1,589 @@ +--- +title: "Release Notes - Flink 1.15" +--- + + +# Release notes - Flink 1.15 + +These release notes discuss important aspects, such as configuration, behavior, +or dependencies, that changed between Flink 1.14 and Flink 1.15. Please read these +notes carefully if you are planning to upgrade your Flink version to 1.15. + +## Summary of changed dependency names + +There are several changes in Flink 1.15 that require updating dependency names when +upgrading from earlier versions, mainly resulting from the effort to opt Scala dependencies +out of non-Scala modules and to reorganize the table modules. A quick checklist of the dependency changes +is as follows: + +* Any dependency on one of the following modules needs to be updated to no longer include a suffix: + +``` +flink-cep +flink-clients +flink-connector-elasticsearch-base +flink-connector-elasticsearch6 +flink-connector-elasticsearch7 +flink-connector-gcp-pubsub +flink-connector-hbase-1.4 +flink-connector-hbase-2.2 +flink-connector-hbase-base +flink-connector-jdbc +flink-connector-kafka +flink-connector-kinesis +flink-connector-nifi +flink-connector-pulsar +flink-connector-rabbitmq +flink-container +flink-dstl-dfs +flink-gelly +flink-hadoop-bulk +flink-kubernetes +flink-runtime-web +flink-sql-connector-elasticsearch6 +flink-sql-connector-elasticsearch7 +flink-sql-connector-hbase-1.4 +flink-sql-connector-hbase-2.2 +flink-sql-connector-kafka +flink-sql-connector-kinesis +flink-sql-connector-rabbitmq +flink-state-processor-api +flink-statebackend-rocksdb +flink-streaming-java +flink-test-utils +flink-yarn +flink-table-api-java-bridge +flink-table-runtime +flink-sql-client +flink-orc +flink-orc-nohive +flink-parquet +``` +* For Table / SQL users, the new module 
`flink-table-planner-loader` replaces `flink-table-planner_2.12` + and avoids the need for a Scala suffix. For backwards compatibility, users can still + swap it with `flink-table-planner_2.12` located in `opt/`. + `flink-table-uber` has been split into `flink-table-api-java-uber`, + `flink-table-planner(-loader)`, and `flink-table-runtime`. Scala users need to explicitly add a dependency + on `flink-table-api-scala` or `flink-table-api-scala-bridge`. + +The details of the involved issues are listed as follows. + + Add support for opting-out of Scala + +# [FLINK-20845](https://issues.apache.org/jira/browse/FLINK-20845) + +The Java DataSet/-Stream APIs are now independent of Scala and no longer transitively depend on it. + +The implications are the following: +
[flink] branch master updated (4c8f323d9ec -> a39fe96b964)
This is an automated email from the ASF dual-hosted git repository. martijnvisser pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git from 4c8f323d9ec [FLINK-27465] Handle conversion of negative long to timestamp in AvroRowDeserializationSchema add a39fe96b964 [FLINK-24766] Ceiling/flooring dates to day returns wrong results. This includes handling millis for timestamps. This closes #17677 No new revisions were added by this update. Summary of changes: .../apache/flink/table/utils/DateTimeUtils.java| 3 + .../planner/codegen/calls/FloorCeilCallGen.scala | 45 ++- .../planner/functions/TimeFunctionsITCase.java | 311 - .../planner/expressions/TemporalTypesTest.scala| 10 +- 4 files changed, 354 insertions(+), 15 deletions(-)
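The FLINK-24766 commit above fixes FLOOR/CEIL of dates and timestamps to DAY, including the handling of milliseconds. The intended semantics can be sketched with plain `java.time`; this is an illustration of the behavior only, not the actual `DateTimeUtils`/`FloorCeilCallGen` change, and the class and method names are made up for the example.

```java
import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;

public class FloorCeilDay {
    // FLOOR to DAY: the start of the timestamp's own day. Sub-second parts
    // (including milliseconds) must be discarded, not ignored in the comparison.
    static LocalDateTime floorToDay(LocalDateTime ts) {
        return ts.truncatedTo(ChronoUnit.DAYS);
    }

    // CEIL to DAY: the start of the next day, unless the timestamp is already
    // exactly at midnight, in which case it must be returned unchanged.
    static LocalDateTime ceilToDay(LocalDateTime ts) {
        LocalDateTime floor = ts.truncatedTo(ChronoUnit.DAYS);
        return ts.equals(floor) ? floor : floor.plusDays(1);
    }

    public static void main(String[] args) {
        // 123 ms past the second: the millis must push CEIL to the next day.
        LocalDateTime t = LocalDateTime.of(2022, 5, 5, 14, 47, 13, 123_000_000);
        System.out.println(floorToDay(t)); // 2022-05-05T00:00
        System.out.println(ceilToDay(t));  // 2022-05-06T00:00
        // A bare midnight value is a fixed point of both operations.
        LocalDateTime midnight = LocalDateTime.of(2022, 5, 5, 0, 0);
        System.out.println(ceilToDay(midnight)); // 2022-05-05T00:00
    }
}
```

The midnight fixed-point case is the kind of boundary the commit's expanded `TimeFunctionsITCase` exercises.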
[flink] branch master updated (a112465efe0 -> 4c8f323d9ec)
This is an automated email from the ASF dual-hosted git repository. thw pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git from a112465efe0 [FLINK-27442][Formats][Avro Confluent] Add Confluent repo to module flink-sql-avro-confluent-registry add 4c8f323d9ec [FLINK-27465] Handle conversion of negative long to timestamp in AvroRowDeserializationSchema No new revisions were added by this update. Summary of changes: .../org/apache/flink/formats/avro/AvroRowDeserializationSchema.java | 5 + 1 file changed, 5 insertions(+)
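FLINK-27465 above handles the conversion of a negative long to a timestamp. A classic bug in this class of conversion is splitting epoch milliseconds into seconds and nanos with truncating `/` and `%`, which produces an inconsistent pair for pre-epoch (negative) values. The sketch below shows the floor-based arithmetic that makes the split correct; it is an illustration of the bug class, not the actual `AvroRowDeserializationSchema` change, and the names are hypothetical.

```java
import java.time.Instant;

public class NegativeMillis {
    // Convert epoch milliseconds to a (seconds, nanos) pair, correct for
    // negative input. Plain division rounds toward zero, so -1500 / 1000 = -1
    // and -1500 % 1000 = -500, an invalid pair (nanos must be non-negative).
    // floorDiv/floorMod round toward negative infinity, matching java.time.
    static long[] toSecondsNanos(long epochMillis) {
        long seconds = Math.floorDiv(epochMillis, 1000L);
        long nanos = Math.floorMod(epochMillis, 1000L) * 1_000_000L;
        return new long[] {seconds, nanos};
    }

    public static void main(String[] args) {
        long millis = -1_500L; // 1.5 seconds before the epoch
        long[] sn = toSecondsNanos(millis);
        System.out.println(sn[0] + "s " + sn[1] + "ns"); // -2s 500000000ns
        // Cross-check against java.time, which handles negative millis correctly.
        Instant i = Instant.ofEpochMilli(millis);
        System.out.println(i.getEpochSecond() + " " + i.getNano()); // -2 500000000
    }
}
```

Pre-1970 dates are exactly where such conversions tend to be caught late, since most test data sits after the epoch.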
[flink] branch master updated: [FLINK-27442][Formats][Avro Confluent] Add Confluent repo to module flink-sql-avro-confluent-registry
This is an automated email from the ASF dual-hosted git repository. martijnvisser pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink.git The following commit(s) were added to refs/heads/master by this push: new a112465efe0 [FLINK-27442][Formats][Avro Confluent] Add Confluent repo to module flink-sql-avro-confluent-registry a112465efe0 is described below commit a112465efe0e32f6c6c5e5e433b8d4b9f90dfd79 Author: MartijnVisser AuthorDate: Wed May 4 15:24:51 2022 +0200 [FLINK-27442][Formats][Avro Confluent] Add Confluent repo to module flink-sql-avro-confluent-registry --- flink-formats/flink-sql-avro-confluent-registry/pom.xml | 7 +++ 1 file changed, 7 insertions(+) diff --git a/flink-formats/flink-sql-avro-confluent-registry/pom.xml b/flink-formats/flink-sql-avro-confluent-registry/pom.xml index 2ce0a31d000..123311900af 100644 --- a/flink-formats/flink-sql-avro-confluent-registry/pom.xml +++ b/flink-formats/flink-sql-avro-confluent-registry/pom.xml @@ -33,6 +33,13 @@ under the License. jar + + + confluent + https://packages.confluent.io/maven/ + + + org.apache.flink
[flink-kubernetes-operator] branch main updated: [FLINK-27303] Improve config cache settings + add cleanup
This is an automated email from the ASF dual-hosted git repository. gyfora pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/flink-kubernetes-operator.git The following commit(s) were added to refs/heads/main by this push: new 86a0d39 [FLINK-27303] Improve config cache settings + add cleanup 86a0d39 is described below commit 86a0d396866cbd0999b599a1e2088ca765e81bae Author: Gyula Fora AuthorDate: Tue May 3 14:13:06 2022 +0200 [FLINK-27303] Improve config cache settings + add cleanup --- .../kubernetes_operator_config_configuration.html | 12 ++ .../operator/config/FlinkConfigManager.java| 27 ++ .../config/KubernetesOperatorConfigOptions.java| 12 ++ 3 files changed, 42 insertions(+), 9 deletions(-) diff --git a/docs/layouts/shortcodes/generated/kubernetes_operator_config_configuration.html b/docs/layouts/shortcodes/generated/kubernetes_operator_config_configuration.html index 61bef00..678a649 100644 --- a/docs/layouts/shortcodes/generated/kubernetes_operator_config_configuration.html +++ b/docs/layouts/shortcodes/generated/kubernetes_operator_config_configuration.html @@ -8,6 +8,18 @@ + +kubernetes.operator.config.cache.size +1000 +Integer +Max config cache size. + + +kubernetes.operator.config.cache.timeout +10 min +Duration +Expiration time for cached configs. 
+ kubernetes.operator.deployment.readiness.timeout 1 min diff --git a/flink-kubernetes-operator/src/main/java/org/apache/flink/kubernetes/operator/config/FlinkConfigManager.java b/flink-kubernetes-operator/src/main/java/org/apache/flink/kubernetes/operator/config/FlinkConfigManager.java index da97fef..dfcfc9e 100644 --- a/flink-kubernetes-operator/src/main/java/org/apache/flink/kubernetes/operator/config/FlinkConfigManager.java +++ b/flink-kubernetes-operator/src/main/java/org/apache/flink/kubernetes/operator/config/FlinkConfigManager.java @@ -43,6 +43,7 @@ import org.slf4j.LoggerFactory; import java.time.Duration; import java.util.Set; import java.util.concurrent.Executors; +import java.util.concurrent.ScheduledExecutorService; import java.util.concurrent.TimeUnit; import java.util.concurrent.atomic.AtomicLong; @@ -55,9 +56,6 @@ public class FlinkConfigManager { private static final Logger LOG = LoggerFactory.getLogger(FlinkConfigManager.class); private static final ObjectMapper objectMapper = new ObjectMapper(); -private static final int MAX_CACHE_SIZE = 1000; -private static final Duration CACHE_TIMEOUT = Duration.ofMinutes(30); - private volatile Configuration defaultConfig; private volatile FlinkOperatorConfiguration operatorConfiguration; private final AtomicLong defaultConfigVersion = new AtomicLong(0); @@ -70,10 +68,14 @@ public class FlinkConfigManager { } public FlinkConfigManager(Configuration defaultConfig) { +Duration cacheTimeout = + defaultConfig.get(KubernetesOperatorConfigOptions.OPERATOR_CONFIG_CACHE_TIMEOUT); this.cache = CacheBuilder.newBuilder() -.maximumSize(MAX_CACHE_SIZE) -.expireAfterAccess(CACHE_TIMEOUT) +.maximumSize( +defaultConfig.get( + KubernetesOperatorConfigOptions.OPERATOR_CONFIG_CACHE_SIZE)) +.expireAfterAccess(cacheTimeout) .removalListener( removalNotification -> FlinkConfigBuilder.cleanupTmpFiles( @@ -87,8 +89,15 @@ public class FlinkConfigManager { }); updateDefaultConfig(defaultConfig); +ScheduledExecutorService 
executorService = Executors.newSingleThreadScheduledExecutor(); +executorService.scheduleWithFixedDelay( +cache::cleanUp, +cacheTimeout.toMillis(), +cacheTimeout.toMillis(), +TimeUnit.MILLISECONDS); + if (defaultConfig.getBoolean(OPERATOR_DYNAMIC_CONFIG_ENABLED)) { -scheduleConfigWatcher(); +scheduleConfigWatcher(executorService); } } @@ -151,11 +160,11 @@ public class FlinkConfigManager { } } -private void scheduleConfigWatcher() { +private void scheduleConfigWatcher(ScheduledExecutorService executorService) { var checkInterval = defaultConfig.get(OPERATOR_DYNAMIC_CONFIG_CHECK_INTERVAL); var millis = checkInterval.toMillis(); -Executors.newSingleThreadScheduledExecutor() -.scheduleAtFixedRate(new ConfigUpdater(), millis, millis, TimeUnit.MILLISECONDS); +executorService.scheduleAtFixedRate( +
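The diff above makes the operator's cache limits configurable and, notably, shares one `ScheduledExecutorService` between the periodic cache cleanup and the dynamic-config watcher instead of letting each feature spawn its own single-threaded executor. A minimal stdlib sketch of that scheduling pattern follows; the task bodies are stand-ins for `cache::cleanUp` and the `ConfigUpdater`, not the operator's real code.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class SharedScheduler {
    // One scheduler thread drives both periodic tasks; returns how many
    // distinct tasks were observed to run at least once.
    public static int runBothTasksOnce() throws InterruptedException {
        ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch bothRan = new CountDownLatch(2);
        // Stand-in for cache::cleanUp, which evicts expired entries so the
        // removal listener can delete their temporary files.
        executor.scheduleWithFixedDelay(bothRan::countDown, 0, 100, TimeUnit.MILLISECONDS);
        // Stand-in for the ConfigUpdater that re-reads the default configuration.
        executor.scheduleAtFixedRate(bothRan::countDown, 0, 100, TimeUnit.MILLISECONDS);
        boolean ok = bothRan.await(5, TimeUnit.SECONDS);
        executor.shutdownNow();
        return ok ? 2 : 0;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runBothTasksOnce() + " periodic tasks ran on one shared thread");
    }
}
```

Sharing the executor keeps the thread count fixed regardless of how many periodic maintenance duties the process accumulates.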
[flink] branch master updated (eee2ab7065d -> b98c66cfe44)
This is an automated email from the ASF dual-hosted git repository. chesnay pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git from eee2ab7065d [hotfix][docs] Fix class name in docs for ExecutionEnvironment class add b98c66cfe44 [FLINK-26496][yarn][tests] Migrate tests to JUnit5 No new revisions were added by this update. Summary of changes: .../flink/test/util/SecureTestEnvironment.java | 8 +- .../flink/yarn/CliFrontendRunWithYarnTest.java | 28 ++- .../test/java/org/apache/flink/yarn/UtilsTest.java | 43 ++-- .../apache/flink/yarn/YARNApplicationITCase.java | 15 +- .../flink/yarn/YARNFileReplicationITCase.java | 27 ++- .../flink/yarn/YARNHighAvailabilityITCase.java | 102 + .../java/org/apache/flink/yarn/YARNITCase.java | 56 ++--- .../yarn/YARNSessionCapacitySchedulerITCase.java | 236 + .../apache/flink/yarn/YARNSessionFIFOITCase.java | 76 +++ .../flink/yarn/YARNSessionFIFOSecuredITCase.java | 71 +++ .../apache/flink/yarn/YarnConfigurationITCase.java | 56 +++-- .../flink/yarn/YarnPrioritySchedulingITCase.java | 21 +- .../java/org/apache/flink/yarn/YarnTestBase.java | 117 +- .../org/apache/flink/yarn/YarnTestBaseTest.java| 11 +- .../org.junit.jupiter.api.extension.Extension | 0 15 files changed, 393 insertions(+), 474 deletions(-) copy {flink-docs => flink-yarn-tests}/src/test/resources/META-INF/services/org.junit.jupiter.api.extension.Extension (100%)
[flink] branch master updated: [hotfix][docs] Fix class name in docs for ExecutionEnvironment class
This is an automated email from the ASF dual-hosted git repository. martijnvisser pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink.git The following commit(s) were added to refs/heads/master by this push: new eee2ab7065d [hotfix][docs] Fix class name in docs for ExecutionEnvironment class eee2ab7065d is described below commit eee2ab7065d869bad7c7b05caa61403c1dfa64d5 Author: Sergey Nuyanzin AuthorDate: Fri Apr 22 14:30:51 2022 +0200 [hotfix][docs] Fix class name in docs for ExecutionEnvironment class --- docs/content.zh/docs/connectors/datastream/formats/hadoop.md | 2 +- docs/content.zh/docs/dev/dataset/hadoop_compatibility.md | 2 +- docs/content/docs/connectors/dataset/formats/hadoop.md | 2 +- docs/content/docs/connectors/datastream/formats/hadoop.md| 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/content.zh/docs/connectors/datastream/formats/hadoop.md b/docs/content.zh/docs/connectors/datastream/formats/hadoop.md index bd19d3f0df3..b5e63b0224e 100644 --- a/docs/content.zh/docs/connectors/datastream/formats/hadoop.md +++ b/docs/content.zh/docs/connectors/datastream/formats/hadoop.md @@ -57,7 +57,7 @@ under the License. 
在 Flink 中使用 Hadoop `InputFormats`,必须首先使用 `HadoopInputs` 工具类的 `readHadoopFile` 或 `createHadoopInput` 包装 Input Format。 前者用于从 `FileInputFormat` 派生的 Input Format,而后者必须用于通用的 Input Format。 -生成的 `InputFormat` 可通过使用 `ExecutionEnvironmen#createInput` 创建数据源。 +生成的 `InputFormat` 可通过使用 `ExecutionEnvironment#createInput` 创建数据源。 生成的 `DataStream` 包含 2 元组,其中第一个字段是键,第二个字段是从 Hadoop `InputFormat` 接收的值。 diff --git a/docs/content.zh/docs/dev/dataset/hadoop_compatibility.md b/docs/content.zh/docs/dev/dataset/hadoop_compatibility.md index 737a78a6964..1007627d25a 100644 --- a/docs/content.zh/docs/dev/dataset/hadoop_compatibility.md +++ b/docs/content.zh/docs/dev/dataset/hadoop_compatibility.md @@ -88,7 +88,7 @@ The former is used for input formats derived from `FileInputFormat` while the latter has to be used for general purpose input formats. The resulting `InputFormat` can be used to create a data source by using -`ExecutionEnvironmen#createInput`. +`ExecutionEnvironment#createInput`. The resulting `DataSet` contains 2-tuples where the first field is the key and the second field is the value retrieved from the Hadoop diff --git a/docs/content/docs/connectors/dataset/formats/hadoop.md b/docs/content/docs/connectors/dataset/formats/hadoop.md index be5205472c5..4a1160d562c 100644 --- a/docs/content/docs/connectors/dataset/formats/hadoop.md +++ b/docs/content/docs/connectors/dataset/formats/hadoop.md @@ -63,7 +63,7 @@ The former is used for input formats derived from `FileInputFormat` while the latter has to be used for general purpose input formats. The resulting `InputFormat` can be used to create a data source by using -`ExecutionEnvironmen#createInput`. +`ExecutionEnvironment#createInput`. 
The resulting `DataSet` contains 2-tuples where the first field is the key and the second field is the value retrieved from the Hadoop diff --git a/docs/content/docs/connectors/datastream/formats/hadoop.md b/docs/content/docs/connectors/datastream/formats/hadoop.md index e2b2c9fd857..d8b682402c8 100644 --- a/docs/content/docs/connectors/datastream/formats/hadoop.md +++ b/docs/content/docs/connectors/datastream/formats/hadoop.md @@ -64,7 +64,7 @@ The former is used for input formats derived from `FileInputFormat` while the latter has to be used for general purpose input formats. The resulting `InputFormat` can be used to create a data source by using -`ExecutionEnvironmen#createInput`. +`ExecutionEnvironment#createInput`. The resulting `DataStream` contains 2-tuples where the first field is the key and the second field is the value retrieved from the Hadoop
[flink-web] 01/02: [hotfix] Correct the last updated date for the roadmap
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 3e6604a1fad0dff100dc806be82bef78b9cb0e56
Author: MartijnVisser
AuthorDate: Wed May 4 12:47:25 2022 +0200

    [hotfix] Correct the last updated date for the roadmap
---
 roadmap.md    | 2 +-
 roadmap.zh.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/roadmap.md b/roadmap.md
index edeb4ac9b..c89ffb835 100644
--- a/roadmap.md
+++ b/roadmap.md
@@ -36,7 +36,7 @@ More details and various smaller changes can be found in the
 The roadmap is continuously updated. New features and efforts should be added to
 the roadmap once there is consensus that they will happen and what they will
 roughly look like for the user.

-**Last Update:** 2021-09-16
+**Last Update:** 2022-04-19

diff --git a/roadmap.zh.md b/roadmap.zh.md
index 9b6b833bb..7750ee046 100644
--- a/roadmap.zh.md
+++ b/roadmap.zh.md
@@ -35,7 +35,7 @@ under the License.
 路线图会不断更新。一旦达成共识,新的特性和工作都会添加到路线图中。
 这里的共识是指这些特性和工作将来确定会发生,以及这些工作对于用户来说大致是什么样的。

-**Last Update:** 2021-04-06
+**Last Update:** 2022-04-19
[flink-web] 02/02: Rebuild website
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 05803f97b2d7dafee16bac26846725b43ba65b8c
Author: MartijnVisser
AuthorDate: Wed May 4 12:48:50 2022 +0200

    Rebuild website
---
 content/roadmap.html    | 2 +-
 content/zh/roadmap.html | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/content/roadmap.html b/content/roadmap.html
index c4a1b7763..3d502bc5c 100644
--- a/content/roadmap.html
+++ b/content/roadmap.html
@@ -284,7 +284,7 @@ efforts, so that users may get a better impression of the overall status and dir
 The roadmap is continuously updated. New features and efforts should be added to
 the roadmap once there is consensus that they will happen and what they will
 roughly look like for the user.

-Last Update: 2021-09-16
+Last Update: 2022-04-19

diff --git a/content/zh/roadmap.html b/content/zh/roadmap.html
index fb26977f9..bcec24a6a 100644
--- a/content/zh/roadmap.html
+++ b/content/zh/roadmap.html
@@ -281,7 +281,7 @@ under the License.
 路线图会不断更新。一旦达成共识,新的特性和工作都会添加到路线图中。
 这里的共识是指这些特性和工作将来确定会发生,以及这些工作对于用户来说大致是什么样的。

-Last Update: 2021-04-06
+Last Update: 2022-04-19
[flink-web] branch asf-site updated (a71013959 -> 05803f97b)
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

    from a71013959 Rebuild website
     new 3e6604a1f [hotfix] Correct the last updated date for the roadmap
     new 05803f97b Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.

Summary of changes:
 content/roadmap.html    | 2 +-
 content/zh/roadmap.html | 2 +-
 roadmap.md              | 2 +-
 roadmap.zh.md           | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)
[flink] branch master updated: [FLINK-27485][docs] Fix documentation build pipeline
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

The following commit(s) were added to refs/heads/master by this push:
     new 22a519d4055 [FLINK-27485][docs] Fix documentation build pipeline
22a519d4055 is described below

commit 22a519d405549d7a53f02697b850929100399872
Author: MartijnVisser
AuthorDate: Wed May 4 12:30:39 2022 +0200

    [FLINK-27485][docs] Fix documentation build pipeline

    Fix the documentation build pipeline by 1) using a different Git command
    (which is supported by the installed Git version on the Docker image) and
    2) upgrading Hugo and making sure that this is added to the PATH (#19632)

    - use compatible command to get the current git branch
    - put hugo onto PATH
---
 .github/workflows/docs.sh | 10 +-
 docs/setup_docs.sh        |  2 +-
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/.github/workflows/docs.sh b/.github/workflows/docs.sh
index abf0497e94c..ed970140dd4 100755
--- a/.github/workflows/docs.sh
+++ b/.github/workflows/docs.sh
@@ -23,23 +23,23 @@ java -version
 javadoc -J-version

 # setup hugo
-HUGO_REPO=https://github.com/gohugoio/hugo/releases/download/v0.80.0/hugo_extended_0.80.0_Linux-64bit.tar.gz
-HUGO_ARTIFACT=hugo_extended_0.80.0_Linux-64bit.tar.gz
+HUGO_REPO=https://github.com/gohugoio/hugo/releases/download/v0.98.0/hugo_extended_0.98.0_Linux-64bit.tar.gz
+HUGO_ARTIFACT=hugo_extended_0.98.0_Linux-64bit.tar.gz
 if ! curl --fail -OL $HUGO_REPO ; then
 	echo "Failed to download Hugo binary"
 	exit 1
 fi
-tar -zxvf $HUGO_ARTIFACT
+tar -zxvf $HUGO_ARTIFACT -C /usr/local/bin

 git submodule update --init --recursive

 # Setup the external documentation modules
 cd docs
 source setup_docs.sh
 cd ..

 # Build the docs
-./hugo --source docs
+hugo --source docs

 # generate docs into docs/target
-./hugo -v --source docs --destination target
+hugo -v --source docs --destination target
 if [ $? -ne 0 ]; then
 	echo "Error building the docs"
 	exit 1

diff --git a/docs/setup_docs.sh b/docs/setup_docs.sh
index 01230c1c7a2..34f3d59d99c 100755
--- a/docs/setup_docs.sh
+++ b/docs/setup_docs.sh
@@ -33,7 +33,7 @@ EOF
 echo "Created temporary file" $goModFileLocation/go.mod

 # Make Hugo retrieve modules which are used for externally hosted documentation
-currentBranch=$(git branch --show-current)
+currentBranch=$(git rev-parse --abbrev-ref HEAD)

 if [[ ! "$currentBranch" =~ ^release- ]] || [[ -z "$currentBranch" ]]; then
   # If the current branch is master or not provided, get the documentation from the main branch
[flink] branch release-1.15 updated: [FLINK-27368][table-planner] Trim casts from character string to numeric
This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a commit to branch release-1.15
in repository https://gitbox.apache.org/repos/asf/flink.git

The following commit(s) were added to refs/heads/release-1.15 by this push:
     new b1b58ee4c2e [FLINK-27368][table-planner] Trim casts from character string to numeric
b1b58ee4c2e is described below

commit b1b58ee4c2e61bc6f8e9c4d356708c1e080b3ee9
Author: Timo Walther
AuthorDate: Mon Apr 25 09:39:29 2022 +0200

    [FLINK-27368][table-planner] Trim casts from character string to numeric

    This closes #19565.
---
 .../casting/StringToNumericPrimitiveCastRule.java      | 14 --
 .../table/planner/functions/casting/CastRulesTest.java | 15 +++
 2 files changed, 23 insertions(+), 6 deletions(-)

diff --git a/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/functions/casting/StringToNumericPrimitiveCastRule.java b/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/functions/casting/StringToNumericPrimitiveCastRule.java
index 314930247a0..a0ba3698bae 100644
--- a/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/functions/casting/StringToNumericPrimitiveCastRule.java
+++ b/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/functions/casting/StringToNumericPrimitiveCastRule.java
@@ -28,6 +28,7 @@ import static org.apache.flink.table.planner.codegen.calls.BuiltInMethods.STRING
 import static org.apache.flink.table.planner.codegen.calls.BuiltInMethods.STRING_DATA_TO_INT;
 import static org.apache.flink.table.planner.codegen.calls.BuiltInMethods.STRING_DATA_TO_LONG;
 import static org.apache.flink.table.planner.codegen.calls.BuiltInMethods.STRING_DATA_TO_SHORT;
+import static org.apache.flink.table.planner.functions.casting.CastRuleUtils.methodCall;
 import static org.apache.flink.table.planner.functions.casting.CastRuleUtils.staticCall;
@@ -54,19 +55,20 @@ class StringToNumericPrimitiveCastRule
             String inputTerm,
             LogicalType inputLogicalType,
             LogicalType targetLogicalType) {
+        final String trimmedInputTerm = methodCall(inputTerm, "trim");
         switch (targetLogicalType.getTypeRoot()) {
             case TINYINT:
-                return staticCall(STRING_DATA_TO_BYTE(), inputTerm);
+                return staticCall(STRING_DATA_TO_BYTE(), trimmedInputTerm);
             case SMALLINT:
-                return staticCall(STRING_DATA_TO_SHORT(), inputTerm);
+                return staticCall(STRING_DATA_TO_SHORT(), trimmedInputTerm);
             case INTEGER:
-                return staticCall(STRING_DATA_TO_INT(), inputTerm);
+                return staticCall(STRING_DATA_TO_INT(), trimmedInputTerm);
             case BIGINT:
-                return staticCall(STRING_DATA_TO_LONG(), inputTerm);
+                return staticCall(STRING_DATA_TO_LONG(), trimmedInputTerm);
             case FLOAT:
-                return staticCall(STRING_DATA_TO_FLOAT(), inputTerm);
+                return staticCall(STRING_DATA_TO_FLOAT(), trimmedInputTerm);
             case DOUBLE:
-                return staticCall(STRING_DATA_TO_DOUBLE(), inputTerm);
+                return staticCall(STRING_DATA_TO_DOUBLE(), trimmedInputTerm);
         }
         throw new IllegalArgumentException("This is a bug. Please file an issue.");
     }

diff --git a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/casting/CastRulesTest.java b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/casting/CastRulesTest.java
index f676f88ff8f..6b69667b2a6 100644
--- a/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/casting/CastRulesTest.java
+++ b/flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/casting/CastRulesTest.java
@@ -169,6 +169,7 @@ class CastRulesTest {
                 .fail(STRING(), fromString("Apache"), TableException.class)
                 .fromCase(STRING(), fromString("1.234"), (byte) 1)
                 .fromCase(STRING(), fromString("123"), (byte) 123)
+                .fromCase(STRING(), fromString(" 123 "), (byte) 123)
                 .fail(STRING(), fromString("-130"), TableException.class)
                 .fromCase(
                         DECIMAL(4, 3),
@@ -203,6 +204,7 @@
                 .fail(STRING(), fromString("Apache"), TableException.class)
                 .fromCase(STRING(), fromString("1.234"), (short) 1)
                 .fromCase(STRING(), fromString("123"), (short) 123)
+                .fromCase(STRING(), fromString(" 123 "), (short) 123)
                 .fail(STRING(), fromString("-32769"), TableException.class)
                 .fromCase(
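The commit above trims the character string before the numeric conversion, so a cast like `CAST(' 123 ' AS SMALLINT)` now succeeds instead of failing. Below is a minimal plain-Java sketch of that trim-then-parse idea; `Short.parseShort` is only a stand-in for Flink's `StringData` conversion methods (the real code path also accepts decimal notation such as `'1.234'`, which `parseShort` alone does not).

```java
public class TrimCastSketch {

    // Trim-then-parse, mirroring the fix: the input is trimmed before the
    // numeric conversion, so surrounding whitespace no longer causes a failure.
    static short castToShort(String s) {
        return Short.parseShort(s.trim());
    }

    public static void main(String[] args) {
        System.out.println(castToShort(" 123 ")); // whitespace is now tolerated -> 123
        System.out.println(castToShort("42"));    // unchanged behaviour -> 42
    }
}
```

Without the `trim()`, the first call would throw a `NumberFormatException`, which is the pre-fix behaviour the new `fromString(" 123 ")` test cases guard against.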
[flink] branch master updated (90e98ba7c85 -> 7128a600654)
This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

    from 90e98ba7c85 [FLINK-27470][state][tests] Migrate test to JUnit5
     add 7128a600654 [FLINK-27368][table-planner] Trim casts from character string to numeric

No new revisions were added by this update.

Summary of changes:
 .../casting/StringToNumericPrimitiveCastRule.java      | 14 --
 .../table/planner/functions/casting/CastRulesTest.java | 15 +++
 2 files changed, 23 insertions(+), 6 deletions(-)