[flink] branch master updated (2306e448bfa -> e962c696492)

2022-11-04 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 2306e448bfa [FLINK-29865][CI] Allow setting the JDK in build-nightly-dist.yml
 add e962c696492 [FLINK-15633][build] Fix building Javadocs on JDK 11

No new revisions were added by this update.

Summary of changes:
 pom.xml | 1 +
 1 file changed, 1 insertion(+)



[flink] branch master updated (d0c6075be47 -> 2306e448bfa)

2022-11-04 Thread nkruber

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from d0c6075be47 [FLINK-29867][build] Update maven-enforcer-plugin to 3.1.0
 add 2306e448bfa [FLINK-29865][CI] Allow setting the JDK in build-nightly-dist.yml

No new revisions were added by this update.

Summary of changes:
 tools/azure-pipelines/build-apache-repo.yml  |  1 +
 tools/azure-pipelines/build-nightly-dist.yml | 11 +++
 2 files changed, 12 insertions(+)



[flink] branch master updated (00a25808dfa -> d0c6075be47)

2022-11-04 Thread nkruber

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 00a25808dfa [FLINK-29730][checkpoint] Do not support concurrent unaligned checkpoints in the ChannelStateWriteRequestDispatcherImpl
 add 614fc2a5fd3 [FLINK-29868] exclude org.osgi.core from snappy-java and commons-compress
 add d0c6075be47 [FLINK-29867][build] Update maven-enforcer-plugin to 3.1.0

No new revisions were added by this update.

Summary of changes:
 pom.xml | 16 +++-
 1 file changed, 15 insertions(+), 1 deletion(-)



[flink-training] branch master updated: [FLINK-26382] Add Chinese documents for flink-training exercises (#46)

2022-04-22 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new aca6c47  [FLINK-26382] Add Chinese documents for flink-training exercises (#46)
aca6c47 is described below

commit aca6c47b79d486eb38969492c7e2dc8cb200d146
Author: T.C 
AuthorDate: Fri Apr 22 17:09:26 2022 +0800

[FLINK-26382] Add Chinese documents for flink-training exercises (#46)

Co-authored-by: Victor Xu 
Co-authored-by: Nico Kruber 
---
 README.md   |   2 +
 README_zh.md| 274 
 build.gradle|   2 +-
 hourly-tips/DISCUSSION.md   |   2 +
 hourly-tips/{DISCUSSION.md => DISCUSSION_zh.md} |  46 ++--
 hourly-tips/README.md   |   2 +
 hourly-tips/README_zh.md|  82 +++
 long-ride-alerts/DISCUSSION.md  |   2 +
 long-ride-alerts/DISCUSSION_zh.md   |  50 +
 long-ride-alerts/README.md  |   2 +
 long-ride-alerts/{README.md => README_zh.md}|  68 +++---
 ride-cleansing/README.md|   2 +
 ride-cleansing/{README.md => README_zh.md}  |  55 ++---
 rides-and-fares/README.md   |   2 +
 rides-and-fares/README_zh.md|  95 
 15 files changed, 601 insertions(+), 85 deletions(-)

diff --git a/README.md b/README.md
index 0fc84e1..b1a65cb 100644
--- a/README.md
+++ b/README.md
@@ -17,6 +17,8 @@ specific language governing permissions and limitations
 under the License.
 -->
 
+[中文版](./README_zh.md)
+
 # Apache Flink Training Exercises
 
 Exercises that accompany the training content in the documentation.
diff --git a/README_zh.md b/README_zh.md
new file mode 100644
index 000..2af9bba
--- /dev/null
+++ b/README_zh.md
@@ -0,0 +1,274 @@
+
+
+# Apache Flink 实践练习
+
+与文档中实践练习内容相关的练习。
+
+## 目录
+
+[**设置开发环境**](#set-up-your-development-environment)
+
+1. [软件要求](#software-requirements)
+1. [克隆并构建 flink-training 项目](#clone-and-build-the-flink-training-project)
+1. [将 flink-training 项目导入 
IDE](#import-the-flink-training-project-into-your-ide)
+
+[**使用出租车数据流(taxi data stream)**](#using-the-taxi-data-streams)
+
+1. [出租车车程(taxi ride)事件结构](#schema-of-taxi-ride-events)
+1. [出租车费用(taxi fare)事件结构](#schema-of-taxi-fare-events)
+
+[**如何做练习**](#how-to-do-the-lab-exercises)
+
+1. [了解数据](#learn-about-the-data)
+2. [在 IDE 中运行和调试 Flink 程序](#run-and-debug-flink-programs-in-your-ide)
+3. [练习、测试及解决方案](#exercises-tests-and-solutions)
+
+[**练习**](#lab-exercises)
+
+[**提交贡献**](#contributing)
+
+[**许可证**](#license)
+
+
+
+## 设置开发环境
+
+你需要设置便于进行开发、调试并运行实践练习的示例和解决方案的环境。
+
+
+
+### 软件要求
+
+Linux、OS X 和 Windows 均可作为 Flink 程序和本地执行的开发环境。 Flink 开发设置需要以下软件,它们应该安装在系统上:
+
+- Git
+- Java 8 或者 Java 11 版本的 JDK (JRE不满足要求;目前不支持其他版本的Java)
+- 支持 Gradle 的 Java (及/或 Scala) 开发IDE
+- 推荐使用 [IntelliJ](https://www.jetbrains.com/idea/), 但 
[Eclipse](https://www.eclipse.org/downloads/) 或 [Visual Studio 
Code](https://code.visualstudio.com/) (安装 [Java extension 
pack](https://code.visualstudio.com/docs/java/java-tutorial) 插件) 也可以用于Java环境
+- 为了使用 Scala, 需要使用 IntelliJ (及其 [Scala 
plugin](https://plugins.jetbrains.com/plugin/1347-scala/) 插件)
+
+> **:information_source: Windows 用户须知:** 实践说明中提供的 shell 命令示例适用于 UNIX 环境。
+> 您可能会发现值得在 Windows 环境中设置 cygwin 或 WSL。对于开发 Flink 
作业(jobs),Windows工作的相当好:可以在单机上运行 Flink 集群、提交作业、运行 webUI 并在IDE中执行作业。
+
+
+
+### 克隆并构建 flink-training 项目
+
+`flink-training` 仓库包含编程练习的习题、测试和参考解决方案。
+
+> **:information_source: 仓库格局:** 本仓库有几个分支,分别指向不同的 Apache Flink 版本,类似于 
[apache/flink](https://github.com/apache/flink) 仓库:
+> - 每个 Apache Flink 次要版本的发布分支,例如 `release-1.10`,和
+> - 一个指向当前 Flink 版本的 `master` 分支(不是 `flink:master`!)
+>
+> 如果想在当前 Flink 版本以外的版本上工作,请务必签出相应的分支。
+
+从 GitHub 克隆出 `flink-training` 仓库,导航到本地项目仓库并构建它:
+
+```bash
+git clone https://github.com/apache/flink-training.git
+cd flink-training
+./gradlew test shadowJar
+```
+
+如果是第一次构建,将会下载此 Flink 练习项目的所有依赖项。这通常需要几分钟时间,但具体取决于互联网连接速度。
+
+如果所有测试都通过并且构建成功,这说明你的实践练习已经开了一个好头。
+
+
+:cn: 中国用户: 点击这里了解如何使用本地 Maven 镜像。
+
+如果你在中国,我们建议将 Maven 存储库配置为使用镜像。 可以通过在 [`build.gradle`](build.gradle) 
文件中取消注释此部分来做到这一点:
+
+```groovy
+repositories {
+    // for access from China, you may need to uncomment this line
+    maven { url 'https://maven.aliyun.com/repository/public/' }
+    mavenCentral()
+    maven {
+        url "https://repository.apache.org/content/repositories/snapshots/"
+        mavenContent {
+            snapshotsOnly()
+        }
+    }
+}
+```
+
+
+
+启用 Scala (可选)
+这个项目中的练习也可以使用 Scala ,但由于非 Scala 用户报告的一些问题,我们决定默认禁用 Scala。
+可以通过以下的方法修改 `gradle.properties` 文件以重新启用所有 Scala 练习和解决方案:
+
+[`gradle.properties`](gradle.properties) 文件如下:
+
+``

[flink-training] branch release-1.14 updated: fixup! [FLINK-26382]Add Chinese documents for flink-training exercises (#46)

2022-04-22 Thread nkruber

nkruber pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/release-1.14 by this push:
 new 18e6db2  fixup! [FLINK-26382]Add Chinese documents for flink-training exercises (#46)
18e6db2 is described below

commit 18e6db2206ca4156e21276b14d35bebaf222c151
Author: Nico Kruber 
AuthorDate: Fri Apr 22 11:15:17 2022 +0200

fixup! [FLINK-26382]Add Chinese documents for flink-training exercises (#46)
---
 long-ride-alerts/README_zh.md | 2 +-
 rides-and-fares/README_zh.md  | 6 +++---
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/long-ride-alerts/README_zh.md b/long-ride-alerts/README_zh.md
index 85912da..3b1d060 100644
--- a/long-ride-alerts/README_zh.md
+++ b/long-ride-alerts/README_zh.md
@@ -44,7 +44,7 @@ END 事件可能会丢失,但你可以假设没有重复的事件,也没有
 ## 入门指南
 
 > :information_source: 最好在 IDE 的 flink-training 项目中找到这些类,而不是使用本节中源文件的链接。
-> 
+>
 ### 练习相关类
 
 - Java:  
[`org.apache.flink.training.exercises.longrides.LongRidesExercise`](src/main/java/org/apache/flink/training/exercises/longrides/LongRidesExercise.java)
diff --git a/rides-and-fares/README_zh.md b/rides-and-fares/README_zh.md
index a1dff45..f54852f 100644
--- a/rides-and-fares/README_zh.md
+++ b/rides-and-fares/README_zh.md
@@ -27,7 +27,7 @@ under the License.
 1. `TaxiRide` END 事件
 1. 一个 `TaxiFare` 事件(其时间戳恰好与开始时间匹配)
 
-最终的结果应该是 `DataStream`,每个不同的 `rideId` 都产生一个 `RideAndFare` 记录。 
+最终的结果应该是 `DataStream`,每个不同的 `rideId` 都产生一个 `RideAndFare` 记录。
 每个 `RideAndFare` 都应该将某个 `rideId` 的 `TaxiRide` START 事件与其匹配的 `TaxiFare` 配对。
 
 ### 输入数据
@@ -37,7 +37,7 @@ under the License.
 
 ### 期望输出
 
-所希望的结果是一个 `RideAndFare` 记录的数据流,每个不同的 `rideId` 都有一条这样的记录。 
+所希望的结果是一个 `RideAndFare` 记录的数据流,每个不同的 `rideId` 都有一条这样的记录。
 本练习设置为忽略 END 事件,你应该连接每次乘车的 START 事件及其相应的车费事件。
 
 一旦具有了相互关联的车程和车费事件,你可以使用 `new RideAndFare(ride, fare)` 方法为输出流创建所需的对象。
@@ -76,7 +76,7 @@ under the License.
 ## 讨论
 
 出于练习的目的,可以假设 START 和 fare 事件完美配对。
-但是在现实世界的应用程序中,你应该担心每当一个事件丢失时,同一个 `rideId` 的另一个事件的状态将被永远保持。 
+但是在现实世界的应用程序中,你应该担心每当一个事件丢失时,同一个 `rideId` 的另一个事件的状态将被永远保持。
 在 [稍后的练习](../long-ride-alerts/README_zh.md) 中,我们将看到 `ProcessFunction` 
和定时器,它们将有助于处理这样的情况。
 
 ## 相关文档



[flink-training] branch release-1.14 updated: [FLINK-26382]Add Chinese documents for flink-training exercises (#46)

2022-04-22 Thread nkruber

nkruber pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/release-1.14 by this push:
 new 0132dd7  [FLINK-26382]Add Chinese documents for flink-training exercises (#46)
0132dd7 is described below

commit 0132dd7be8c881607f9a374613309493ade8c6dd
Author: T.C 
AuthorDate: Fri Apr 22 17:09:26 2022 +0800

[FLINK-26382]Add Chinese documents for flink-training exercises (#46)

Co-authored-by: Victor Xu 
---
 README.md   |   2 +
 README_zh.md| 274 
 hourly-tips/DISCUSSION.md   |   2 +
 hourly-tips/{DISCUSSION.md => DISCUSSION_zh.md} |  46 ++--
 hourly-tips/README.md   |   2 +
 hourly-tips/README_zh.md|  82 +++
 long-ride-alerts/DISCUSSION.md  |   2 +
 long-ride-alerts/DISCUSSION_zh.md   |  50 +
 long-ride-alerts/README.md  |   2 +
 long-ride-alerts/{README.md => README_zh.md}|  68 +++---
 ride-cleansing/README.md|   2 +
 ride-cleansing/{README.md => README_zh.md}  |  55 ++---
 rides-and-fares/README.md   |   2 +
 rides-and-fares/README_zh.md|  95 
 14 files changed, 600 insertions(+), 84 deletions(-)

diff --git a/README.md b/README.md
index 0fc84e1..b1a65cb 100644
--- a/README.md
+++ b/README.md
@@ -17,6 +17,8 @@ specific language governing permissions and limitations
 under the License.
 -->
 
+[中文版](./README_zh.md)
+
 # Apache Flink Training Exercises
 
 Exercises that accompany the training content in the documentation.
diff --git a/README_zh.md b/README_zh.md
new file mode 100644
index 000..2af9bba
--- /dev/null
+++ b/README_zh.md
@@ -0,0 +1,274 @@
+
+
+# Apache Flink 实践练习
+
+与文档中实践练习内容相关的练习。
+
+## 目录
+
+[**设置开发环境**](#set-up-your-development-environment)
+
+1. [软件要求](#software-requirements)
+1. [克隆并构建 flink-training 项目](#clone-and-build-the-flink-training-project)
+1. [将 flink-training 项目导入 
IDE](#import-the-flink-training-project-into-your-ide)
+
+[**使用出租车数据流(taxi data stream)**](#using-the-taxi-data-streams)
+
+1. [出租车车程(taxi ride)事件结构](#schema-of-taxi-ride-events)
+1. [出租车费用(taxi fare)事件结构](#schema-of-taxi-fare-events)
+
+[**如何做练习**](#how-to-do-the-lab-exercises)
+
+1. [了解数据](#learn-about-the-data)
+2. [在 IDE 中运行和调试 Flink 程序](#run-and-debug-flink-programs-in-your-ide)
+3. [练习、测试及解决方案](#exercises-tests-and-solutions)
+
+[**练习**](#lab-exercises)
+
+[**提交贡献**](#contributing)
+
+[**许可证**](#license)
+
+
+
+## 设置开发环境
+
+你需要设置便于进行开发、调试并运行实践练习的示例和解决方案的环境。
+
+
+
+### 软件要求
+
+Linux、OS X 和 Windows 均可作为 Flink 程序和本地执行的开发环境。 Flink 开发设置需要以下软件,它们应该安装在系统上:
+
+- Git
+- Java 8 或者 Java 11 版本的 JDK (JRE不满足要求;目前不支持其他版本的Java)
+- 支持 Gradle 的 Java (及/或 Scala) 开发IDE
+- 推荐使用 [IntelliJ](https://www.jetbrains.com/idea/), 但 
[Eclipse](https://www.eclipse.org/downloads/) 或 [Visual Studio 
Code](https://code.visualstudio.com/) (安装 [Java extension 
pack](https://code.visualstudio.com/docs/java/java-tutorial) 插件) 也可以用于Java环境
+- 为了使用 Scala, 需要使用 IntelliJ (及其 [Scala 
plugin](https://plugins.jetbrains.com/plugin/1347-scala/) 插件)
+
+> **:information_source: Windows 用户须知:** 实践说明中提供的 shell 命令示例适用于 UNIX 环境。
+> 您可能会发现值得在 Windows 环境中设置 cygwin 或 WSL。对于开发 Flink 
作业(jobs),Windows工作的相当好:可以在单机上运行 Flink 集群、提交作业、运行 webUI 并在IDE中执行作业。
+
+
+
+### 克隆并构建 flink-training 项目
+
+`flink-training` 仓库包含编程练习的习题、测试和参考解决方案。
+
+> **:information_source: 仓库格局:** 本仓库有几个分支,分别指向不同的 Apache Flink 版本,类似于 
[apache/flink](https://github.com/apache/flink) 仓库:
+> - 每个 Apache Flink 次要版本的发布分支,例如 `release-1.10`,和
+> - 一个指向当前 Flink 版本的 `master` 分支(不是 `flink:master`!)
+>
+> 如果想在当前 Flink 版本以外的版本上工作,请务必签出相应的分支。
+
+从 GitHub 克隆出 `flink-training` 仓库,导航到本地项目仓库并构建它:
+
+```bash
+git clone https://github.com/apache/flink-training.git
+cd flink-training
+./gradlew test shadowJar
+```
+
+如果是第一次构建,将会下载此 Flink 练习项目的所有依赖项。这通常需要几分钟时间,但具体取决于互联网连接速度。
+
+如果所有测试都通过并且构建成功,这说明你的实践练习已经开了一个好头。
+
+
+:cn: 中国用户: 点击这里了解如何使用本地 Maven 镜像。
+
+如果你在中国,我们建议将 Maven 存储库配置为使用镜像。 可以通过在 [`build.gradle`](build.gradle) 
文件中取消注释此部分来做到这一点:
+
+```groovy
+repositories {
+    // for access from China, you may need to uncomment this line
+    maven { url 'https://maven.aliyun.com/repository/public/' }
+    mavenCentral()
+    maven {
+        url "https://repository.apache.org/content/repositories/snapshots/"
+        mavenContent {
+            snapshotsOnly()
+        }
+    }
+}
+```
+
+
+
+启用 Scala (可选)
+这个项目中的练习也可以使用 Scala ,但由于非 Scala 用户报告的一些问题,我们决定默认禁用 Scala。
+可以通过以下的方法修改 `gradle.properties` 文件以重新启用所有 Scala 练习和解决方案:
+
+[`gradle.properties`](gradle.properties) 文件如下:
+
+```properties
+#...
+
+# Scala exercises can be enabled by setting this to true
+

[flink-training] branch master updated: fixup! [FLINK-25313] Enable flink runtime web-ui (#45)

2022-04-20 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 28aa5d8  fixup! [FLINK-25313] Enable flink runtime web-ui (#45)
28aa5d8 is described below

commit 28aa5d8114f7713097379ed02eade15dbe977e8e
Author: Nico Kruber 
AuthorDate: Wed Apr 20 12:51:17 2022 +0200

fixup! [FLINK-25313] Enable flink runtime web-ui (#45)
---
 build.gradle | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/build.gradle b/build.gradle
index dbce4f3..a9e4756 100644
--- a/build.gradle
+++ b/build.gradle
@@ -87,7 +87,7 @@ subprojects {
 shadow "org.apache.flink:flink-java:${flinkVersion}"
 shadow "org.apache.flink:flink-streaming-java_${scalaBinaryVersion}:${flinkVersion}"
 shadow "org.apache.flink:flink-streaming-scala_${scalaBinaryVersion}:${flinkVersion}"
-
+
 // allows using Flink's web UI when running in the IDE:
 shadow "org.apache.flink:flink-runtime-web_${scalaBinaryVersion}:${flinkVersion}"
 

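The `fixup!` subject above is the convention produced by `git commit --fixup`; such a commit is meant to be folded into its target later with an autosquash rebase. A minimal sketch in a throw-away repository (the file name and commit message are illustrative only, not taken from this thread):

```bash
# Sketch: create a commit, record a correction as a "fixup!" of it, then
# squash the pair with an autosquash rebase. Runs in a temporary repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name "Demo User"

echo "first version" > notes.md
git add notes.md
git commit -qm "[FLINK-25313] Enable flink runtime web-ui (#45)"

echo "small correction" > notes.md
git commit -qa --fixup=HEAD          # subject becomes "fixup! [FLINK-25313] ..."

# Non-interactive autosquash: the generated todo list is accepted as-is.
GIT_SEQUENCE_EDITOR=: git rebase -i --autosquash --root 2>/dev/null
git rev-list --count HEAD            # the fixup has been folded into its target
git log --format=%s -1
```

After the rebase a single commit remains, carrying the original subject with the correction's content.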


[flink-training] branch master updated: [FLINK-25313] Enable flink runtime web-ui (#45)

2022-04-20 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 9c25c54  [FLINK-25313] Enable flink runtime web-ui (#45)
9c25c54 is described below

commit 9c25c54a73578b24b15e9c52e28740460a687f93
Author: Junfan Zhang 
AuthorDate: Wed Apr 20 18:47:07 2022 +0800

[FLINK-25313] Enable flink runtime web-ui (#45)

* [FLINK-25313] Enable flink runtime web-ui when running in the IDE

With this, when running Flink applications locally, you can browse Flink's web UI to see more details.

Co-authored-by: Junfan Zhang 
---
 build.gradle | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/build.gradle b/build.gradle
index 9395b08..dbce4f3 100644
--- a/build.gradle
+++ b/build.gradle
@@ -87,6 +87,9 @@ subprojects {
 shadow "org.apache.flink:flink-java:${flinkVersion}"
 shadow "org.apache.flink:flink-streaming-java_${scalaBinaryVersion}:${flinkVersion}"
 shadow "org.apache.flink:flink-streaming-scala_${scalaBinaryVersion}:${flinkVersion}"
+
+// allows using Flink's web UI when running in the IDE:
+shadow "org.apache.flink:flink-runtime-web_${scalaBinaryVersion}:${flinkVersion}"
 
 if (project != project(":common")) {
 implementation project(path: ':common')



[flink] branch master updated (0bc2234 -> 7bbf368)

2022-01-11 Thread nkruber

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 0bc2234  [FLINK-25266][e2e] Convert StreamingKafkaITCase to SmokeKafkaITCase covering application packaging
 new e9dba5e  [FLINK-25362][docs] allow SQL maven artifact list
 new dccbb87  [FLINK-25362][docs] fix the Avro Format maven dependency
 new 7bbf368  [FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../docs/connectors/table/formats/avro-confluent.md   |  2 ++
 docs/content/docs/connectors/table/formats/avro-confluent.md  |  2 ++
 docs/data/sql_connectors.yml  |  6 --
 docs/layouts/shortcodes/sql_download_table.html   | 11 ---
 4 files changed, 16 insertions(+), 5 deletions(-)


[flink] 03/03: [FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"

2022-01-11 Thread nkruber

nkruber pushed a commit to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git

commit e3d06b8807fd736a9ae3706c7c5bc4ef4336cd51
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:50:06 2021 +0100

[FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"
---
 docs/content.zh/docs/connectors/table/formats/avro-confluent.md | 2 ++
 docs/content/docs/connectors/table/formats/avro-confluent.md| 2 ++
 docs/data/sql_connectors.yml| 4 +++-
 3 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/docs/content.zh/docs/connectors/table/formats/avro-confluent.md 
b/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
index 9ff6aba..b8ca74a 100644
--- a/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
+++ b/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
@@ -44,6 +44,8 @@ Avro Schema Registry 格式只能与 [Apache Kafka SQL 连接器]({{< ref 
"docs/
 
 {{< sql_download_table "avro-confluent" >}}
 
+For Maven, SBT, Gradle, or other build automation tools, please also ensure that Confluent's maven repository at `https://packages.confluent.io/maven/` is configured in your project's build files.
+
 如何创建使用 Avro-Confluent 格式的表
 
 
diff --git a/docs/content/docs/connectors/table/formats/avro-confluent.md 
b/docs/content/docs/connectors/table/formats/avro-confluent.md
index d8d9224..518baff 100644
--- a/docs/content/docs/connectors/table/formats/avro-confluent.md
+++ b/docs/content/docs/connectors/table/formats/avro-confluent.md
@@ -42,6 +42,8 @@ Dependencies
 
 {{< sql_download_table "avro-confluent" >}}
 
+For Maven, SBT, Gradle, or other build automation tools, please also ensure that Confluent's maven repository at `https://packages.confluent.io/maven/` is configured in your project's build files.
+
 How to create tables with Avro-Confluent format
 --
 
diff --git a/docs/data/sql_connectors.yml b/docs/data/sql_connectors.yml
index 44d92ff..63f52421 100644
--- a/docs/data/sql_connectors.yml
+++ b/docs/data/sql_connectors.yml
@@ -44,7 +44,9 @@ avro:
 
 avro-confluent:
 name: Avro Schema Registry
-maven: flink-avro-confluent-registry
+maven:
+  - flink-avro-confluent-registry
+  - flink-avro
 category: format
 sql_url: https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro-confluent-registry/$version/flink-sql-avro-confluent-registry-$version.jar
 

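The `sql_connectors.yml` entries above carry a `sql_url` whose `$version` placeholder is substituted by the docs build. A one-line shell sketch of that substitution (the version number is an arbitrary example, not one from this thread):

```bash
# Expand the $version placeholder from sql_connectors.yml into a concrete URL.
version="1.14.3"   # example only; the docs build injects the real release
sql_url="https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro-confluent-registry/${version}/flink-sql-avro-confluent-registry-${version}.jar"
echo "$sql_url"
```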

[flink] 02/03: [FLINK-25362][docs] fix the Avro Format maven dependency

2022-01-11 Thread nkruber

nkruber pushed a commit to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 06ab1ecbae141abfc4f97edf13fc9eba377efe9f
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:38:15 2021 +0100

[FLINK-25362][docs] fix the Avro Format maven dependency

It was falsely pointing to the shaded sql artifact.
---
 docs/data/sql_connectors.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/data/sql_connectors.yml b/docs/data/sql_connectors.yml
index 989b5f2..44d92ff 100644
--- a/docs/data/sql_connectors.yml
+++ b/docs/data/sql_connectors.yml
@@ -38,7 +38,7 @@
 
 avro:
 name: Avro
-maven: flink-sql-avro
+maven: flink-avro
 category: format
 sql_url: https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro/$version/flink-sql-avro-$version.jar
 


[flink] 01/03: [FLINK-25362][docs] allow SQL maven artifact list

2022-01-11 Thread nkruber

nkruber pushed a commit to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 327113d26e80cfe3d9adf759473ce78374df47fd
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:36:07 2021 +0100

[FLINK-25362][docs] allow SQL maven artifact list

If a connector needs more than one maven dependency, these can instead be
defined as a list/array now. The old syntax with a single string is also still
possible.
---
 docs/layouts/shortcodes/sql_download_table.html | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/docs/layouts/shortcodes/sql_download_table.html 
b/docs/layouts/shortcodes/sql_download_table.html
index 3c348c7e..0952e06 100644
--- a/docs/layouts/shortcodes/sql_download_table.html
+++ b/docs/layouts/shortcodes/sql_download_table.html
@@ -89,11 +89,16 @@ and SQL Client with SQL JAR bundles.
   rendered documentation. 
 */}}
 {{ define "maven-artifact" }}{{/* (dict "ArtifactId" .ArtifactId */}}
-<dependency>
+
+{{- $artifacts := cond (eq (printf "%T" .ArtifactId) "string") (slice .ArtifactId) .ArtifactId -}}
+{{- range $artifact := $artifacts -}}
+  <dependency>
   <groupId>org.apache.flink</groupId>
-  <artifactId>{{- partial "docs/interpolate" .ArtifactId -}}</artifactId>
+  <artifactId>{{- partial "docs/interpolate" $artifact -}}</artifactId>
   <version>{{- site.Params.Version -}}</version>
-</dependency>
+  </dependency>
+{{- end -}}
+
 
 Copied to clipboard!
 

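The shortcode change above lets `.ArtifactId` be either a single string or a list, and emits one `<dependency>` block per entry. A rough shell stand-in for that loop (not the Hugo code itself; the version number is only an example):

```bash
# Emit one Maven <dependency> block per artifactId, mirroring the range
# loop the shortcode now uses for string-or-list "maven:" values.
render_dependencies() {
  version=$1
  shift
  for artifact in "$@"; do
    printf '<dependency>\n'
    printf '  <groupId>org.apache.flink</groupId>\n'
    printf '  <artifactId>%s</artifactId>\n' "$artifact"
    printf '  <version>%s</version>\n' "$version"
    printf '</dependency>\n'
  done
}

render_dependencies 1.14.3 flink-avro                                # single artifact
render_dependencies 1.14.3 flink-avro-confluent-registry flink-avro  # list, as for avro-confluent
```

This matches the yml change in the same thread, where `avro-confluent` now lists two artifacts while `avro` keeps a single string.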

[flink] branch release-1.13 updated (86a6e2b -> e3d06b8)

2022-01-11 Thread nkruber

nkruber pushed a change to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 86a6e2b  [FLINK-25307][test] Print the curl logs is querying dispatcher startup failed
 new 327113d  [FLINK-25362][docs] allow SQL maven artifact list
 new 06ab1ec  [FLINK-25362][docs] fix the Avro Format maven dependency
 new e3d06b8  [FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../docs/connectors/table/formats/avro-confluent.md   |  2 ++
 docs/content/docs/connectors/table/formats/avro-confluent.md  |  2 ++
 docs/data/sql_connectors.yml  |  6 --
 docs/layouts/shortcodes/sql_download_table.html   | 11 ---
 4 files changed, 16 insertions(+), 5 deletions(-)


[flink] 02/03: [FLINK-25362][docs] fix the Avro Format maven dependency

2022-01-11 Thread nkruber

nkruber pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink.git

commit eca3aabc1691f3abb604994f06153ddf84a132d7
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:38:15 2021 +0100

[FLINK-25362][docs] fix the Avro Format maven dependency

It was falsely pointing to the shaded sql artifact.
---
 docs/data/sql_connectors.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/data/sql_connectors.yml b/docs/data/sql_connectors.yml
index 989b5f2..44d92ff 100644
--- a/docs/data/sql_connectors.yml
+++ b/docs/data/sql_connectors.yml
@@ -38,7 +38,7 @@
 
 avro:
 name: Avro
-maven: flink-sql-avro
+maven: flink-avro
 category: format
 sql_url: https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro/$version/flink-sql-avro-$version.jar
 


[flink] branch release-1.14 updated (ae8a4b7 -> 7b24c90)

2022-01-11 Thread nkruber

nkruber pushed a change to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink.git.


from ae8a4b7  [FLINK-25307][test] Print the curl logs is querying dispatcher startup failed
 new d8b6f89  [FLINK-25362][docs] allow SQL maven artifact list
 new eca3aab  [FLINK-25362][docs] fix the Avro Format maven dependency
 new 7b24c90  [FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../docs/connectors/table/formats/avro-confluent.md   |  2 ++
 docs/content/docs/connectors/table/formats/avro-confluent.md  |  2 ++
 docs/data/sql_connectors.yml  |  6 --
 docs/layouts/shortcodes/sql_download_table.html   | 11 ---
 4 files changed, 16 insertions(+), 5 deletions(-)


[flink] 03/03: [FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"

2022-01-11 Thread nkruber

nkruber pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 7b24c90c125d4eaf8e97e6be8073cf8ee83d2290
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:50:06 2021 +0100

[FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"
---
 docs/content.zh/docs/connectors/table/formats/avro-confluent.md | 2 ++
 docs/content/docs/connectors/table/formats/avro-confluent.md| 2 ++
 docs/data/sql_connectors.yml| 4 +++-
 3 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/docs/content.zh/docs/connectors/table/formats/avro-confluent.md 
b/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
index 61248d1..ddb3c08 100644
--- a/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
+++ b/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
@@ -44,6 +44,8 @@ Avro Schema Registry 格式只能与 [Apache Kafka SQL 连接器]({{< ref 
"docs/
 
 {{< sql_download_table "avro-confluent" >}}
 
+For Maven, SBT, Gradle, or other build automation tools, please also ensure that Confluent's maven repository at `https://packages.confluent.io/maven/` is configured in your project's build files.
+
 如何创建使用 Avro-Confluent 格式的表
 
 
diff --git a/docs/content/docs/connectors/table/formats/avro-confluent.md 
b/docs/content/docs/connectors/table/formats/avro-confluent.md
index 7ab4c98..cf2fe70 100644
--- a/docs/content/docs/connectors/table/formats/avro-confluent.md
+++ b/docs/content/docs/connectors/table/formats/avro-confluent.md
@@ -42,6 +42,8 @@ Dependencies
 
 {{< sql_download_table "avro-confluent" >}}
 
+For Maven, SBT, Gradle, or other build automation tools, please also ensure that Confluent's maven repository at `https://packages.confluent.io/maven/` is configured in your project's build files.
+
 How to create tables with Avro-Confluent format
 --
 
diff --git a/docs/data/sql_connectors.yml b/docs/data/sql_connectors.yml
index 44d92ff..63f52421 100644
--- a/docs/data/sql_connectors.yml
+++ b/docs/data/sql_connectors.yml
@@ -44,7 +44,9 @@ avro:
 
 avro-confluent:
 name: Avro Schema Registry
-maven: flink-avro-confluent-registry
+maven:
+  - flink-avro-confluent-registry
+  - flink-avro
 category: format
 sql_url: https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro-confluent-registry/$version/flink-sql-avro-confluent-registry-$version.jar
 


[flink] 01/03: [FLINK-25362][docs] allow SQL maven artifact list

2022-01-11 Thread nkruber

nkruber pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink.git

commit d8b6f896c424e92bf020abc18f532cf02e99bdc7
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:36:07 2021 +0100

[FLINK-25362][docs] allow SQL maven artifact list

If a connector needs more than one maven dependency, these can instead be
defined as a list/array now. The old syntax with a single string is also still
possible.
---
 docs/layouts/shortcodes/sql_download_table.html | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/docs/layouts/shortcodes/sql_download_table.html 
b/docs/layouts/shortcodes/sql_download_table.html
index 3c348c7e..0952e06 100644
--- a/docs/layouts/shortcodes/sql_download_table.html
+++ b/docs/layouts/shortcodes/sql_download_table.html
@@ -89,11 +89,16 @@ and SQL Client with SQL JAR bundles.
   rendered documentation. 
 */}}
 {{ define "maven-artifact" }}{{/* (dict "ArtifactId" .ArtifactId */}}
-<dependency>
+
+{{- $artifacts := cond (eq (printf "%T" .ArtifactId) "string") (slice .ArtifactId) .ArtifactId -}}
+{{- range $artifact := $artifacts -}}
+  <dependency>
   <groupId>org.apache.flink</groupId>
-  <artifactId>{{- partial "docs/interpolate" .ArtifactId -}}</artifactId>
+  <artifactId>{{- partial "docs/interpolate" $artifact -}}</artifactId>
   <version>{{- site.Params.Version -}}</version>
-</dependency>
+  </dependency>
+{{- end -}}
+
 
 Copied to clipboard!
 


[flink] 02/03: [FLINK-25362][docs] fix the Avro Format maven dependency

2022-01-11 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit dccbb87462b9188472d2b93e301774ca4716ce4d
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:38:15 2021 +0100

[FLINK-25362][docs] fix the Avro Format maven dependency

It was falsely pointing to the shaded sql artifact.
---
 docs/data/sql_connectors.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/data/sql_connectors.yml b/docs/data/sql_connectors.yml
index b3e6d53..31e68ee 100644
--- a/docs/data/sql_connectors.yml
+++ b/docs/data/sql_connectors.yml
@@ -38,7 +38,7 @@
 
 avro:
 name: Avro
-maven: flink-sql-avro
+maven: flink-avro
 category: format
 sql_url: https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro/$version/flink-sql-avro-$version.jar
 


[flink] 03/03: [FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"

2022-01-11 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 7bbf368bdfbf518245a659cc64ee8a8661ba4e8c
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:50:06 2021 +0100

[FLINK-25362][docs] fix maven instructions for "Confluent Avro Format"
---
 docs/content.zh/docs/connectors/table/formats/avro-confluent.md | 2 ++
 docs/content/docs/connectors/table/formats/avro-confluent.md| 2 ++
 docs/data/sql_connectors.yml| 4 +++-
 3 files changed, 7 insertions(+), 1 deletion(-)

diff --git a/docs/content.zh/docs/connectors/table/formats/avro-confluent.md 
b/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
index 61248d1..ddb3c08 100644
--- a/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
+++ b/docs/content.zh/docs/connectors/table/formats/avro-confluent.md
@@ -44,6 +44,8 @@ Avro Schema Registry 格式只能与 [Apache Kafka SQL 连接器]({{< ref 
"docs/
 
 {{< sql_download_table "avro-confluent" >}}
 
+For Maven, SBT, Gradle, or other build automation tools, please also ensure that Confluent's maven repository at `https://packages.confluent.io/maven/` is configured in your project's build files.
+
 如何创建使用 Avro-Confluent 格式的表
 
 
diff --git a/docs/content/docs/connectors/table/formats/avro-confluent.md 
b/docs/content/docs/connectors/table/formats/avro-confluent.md
index 7ab4c98..cf2fe70 100644
--- a/docs/content/docs/connectors/table/formats/avro-confluent.md
+++ b/docs/content/docs/connectors/table/formats/avro-confluent.md
@@ -42,6 +42,8 @@ Dependencies
 
 {{< sql_download_table "avro-confluent" >}}
 
+For Maven, SBT, Gradle, or other build automation tools, please also ensure 
that Confluent's maven repository at `https://packages.confluent.io/maven/` is 
configured in your project's build files.
+
 How to create tables with Avro-Confluent format
 --
 
diff --git a/docs/data/sql_connectors.yml b/docs/data/sql_connectors.yml
index 31e68ee..003828f 100644
--- a/docs/data/sql_connectors.yml
+++ b/docs/data/sql_connectors.yml
@@ -44,7 +44,9 @@ avro:
 
 avro-confluent:
 name: Avro Schema Registry
-maven: flink-avro-confluent-registry
+maven:
+  - flink-avro-confluent-registry
+  - flink-avro
 category: format
 sql_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-avro-confluent-registry/$version/flink-sql-avro-confluent-registry-$version.jar
 


[flink] 01/03: [FLINK-25362][docs] allow SQL maven artifact list

2022-01-11 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit e9dba5e81f353fc6795b03add39a6d299e4c7aa5
Author: Nico Kruber 
AuthorDate: Fri Dec 17 11:36:07 2021 +0100

[FLINK-25362][docs] allow SQL maven artifact list

If a connector needs more than one maven dependency, these can instead be
defined as a list/array now. The old syntax with a single string is also 
still
possible.
---
 docs/layouts/shortcodes/sql_download_table.html | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/docs/layouts/shortcodes/sql_download_table.html 
b/docs/layouts/shortcodes/sql_download_table.html
index 3c348c7e..0952e06 100644
--- a/docs/layouts/shortcodes/sql_download_table.html
+++ b/docs/layouts/shortcodes/sql_download_table.html
@@ -89,11 +89,16 @@ and SQL Client with SQL JAR bundles.
   rendered documentation. 
 */}}
 {{ define "maven-artifact" }}{{/* (dict "ArtifactId" .ArtifactId */}}
-
+
+{{- $artifacts := cond (eq (printf "%T" .ArtifactId) "string") (slice 
.ArtifactId) .ArtifactId -}}
+{{- range $artifact := $artifacts -}}
+  
   org.apache.flink/groupId
-  {{- partial "docs/interpolate" 
.ArtifactId -}}/artifactId
+  {{- partial "docs/interpolate" 
$artifact -}}/artifactId
   {{- site.Params.Version -}}/version
-/dependency
+/dependency
+{{- end -}}
+
 
 Copied to clipboard!
 
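The template change above boils down to normalizing a value that may be either a single string or a list, then iterating over it uniformly — the Hugo `cond (eq (printf "%T" .ArtifactId) "string") (slice .ArtifactId) .ArtifactId` expression. A minimal Java sketch of the same normalization pattern (hypothetical helper, not part of the Flink codebase):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class ArtifactIds {
    // Accepts a value that may be a single String or a List of Strings and
    // normalizes it to a List, mirroring the cond/slice logic in the shortcode.
    @SuppressWarnings("unchecked")
    static List<String> normalize(Object artifactId) {
        if (artifactId instanceof String) {
            // old single-string syntax: wrap it in a one-element list
            return Collections.singletonList((String) artifactId);
        }
        // new list syntax: already iterable as-is
        return (List<String>) artifactId;
    }

    public static void main(String[] args) {
        System.out.println(normalize("flink-avro"));
        System.out.println(normalize(
                Arrays.asList("flink-avro-confluent-registry", "flink-avro")));
    }
}
```

With this normalization in place, `sql_connectors.yml` entries can keep the old single-string `maven:` form while newer entries supply a list, and the rendering loop treats both the same way.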


[flink-training] branch master updated: [hotfix] Fix links to old documentation

2021-11-08 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new e9e5adf  [hotfix] Fix links to old documentation
e9e5adf is described below

commit e9e5adfab00e20335142fbf3a6603db16f396432
Author: Ali Bahadir Zeybek 
AuthorDate: Thu Oct 21 15:56:30 2021 +0200

[hotfix] Fix links to old documentation
---
 README.md  | 2 +-
 hourly-tips/README.md  | 4 ++--
 long-ride-alerts/DISCUSSION.md | 2 +-
 long-ride-alerts/README.md | 4 ++--
 ride-cleansing/README.md   | 4 ++--
 rides-and-fares/README.md  | 2 +-
 6 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index 34ded7b..0fc84e1 100644
--- a/README.md
+++ b/README.md
@@ -132,7 +132,7 @@ Then you should be able to open 
[`RideCleansingTest`](ride-cleansing/src/test/ja
 
 > **:information_source: Note for Scala users:** You will need to use IntelliJ 
 > with the JetBrains Scala plugin, and you will need to add a Scala 2.12 SDK 
 > to the Global Libraries section of the Project Structure as well as to the 
 > module you are working on.
 > IntelliJ will ask you for the latter when you open a Scala file.
-> Please note that Scala 2.12.8 and above are not supported (see [Flink Scala 
Versions](https://ci.apache.org/projects/flink/flink-docs-stable/docs/dev/datastream/project-configuration/#scala-versions
 for details))!
+> Please note that Scala 2.12.8 and above are not supported (see [Flink Scala 
Versions](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/project-configuration/#scala-versions
 for details))!
 
 ## Use the taxi data streams
 
diff --git a/hourly-tips/README.md b/hourly-tips/README.md
index 142edec..a49a4c5 100644
--- a/hourly-tips/README.md
+++ b/hourly-tips/README.md
@@ -59,8 +59,8 @@ Note that it is possible to cascade one set of time windows 
after another, so lo
 
 ## Documentation
 
-- 
[Windows](https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/windows.html)
-- [See the section on aggregations on 
windows](https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/#datastream-transformations)
+- 
[Windows](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/operators/windows)
+- [See the section on aggregations on 
windows](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/operators/overview/#datastream-transformations)
 
 ## Reference Solutions
 
diff --git a/long-ride-alerts/DISCUSSION.md b/long-ride-alerts/DISCUSSION.md
index a91e7fc..143b5e2 100644
--- a/long-ride-alerts/DISCUSSION.md
+++ b/long-ride-alerts/DISCUSSION.md
@@ -31,7 +31,7 @@ These cases are worth noting:
 event will be stored in state indefinitely (this is another leak!).
 
 These leaks could be addressed by either
-using [state 
TTL](https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/state/state.html#state-time-to-live-ttl),
+using [state 
TTL](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/fault-tolerance/state/#state-time-to-live-ttl),
 or another timer, to eventually clear any lingering state.
 
 ### Bottom line
diff --git a/long-ride-alerts/README.md b/long-ride-alerts/README.md
index b95e261..07bc566 100644
--- a/long-ride-alerts/README.md
+++ b/long-ride-alerts/README.md
@@ -76,8 +76,8 @@ The challenge is figuring out what state and timers to use, 
and when to set and
 
 ## Documentation
 
-- 
[ProcessFunction](https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/process_function.html)
-- [Working with 
State](https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/state/index.html)
+- 
[ProcessFunction](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/operators/process_function)
+- [Working with 
State](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/fault-tolerance/state)
 
 ## After you've completed the exercise
 
diff --git a/ride-cleansing/README.md b/ride-cleansing/README.md
index ee76a47..254291b 100644
--- a/ride-cleansing/README.md
+++ b/ride-cleansing/README.md
@@ -80,8 +80,8 @@ Flink's DataStream API features a 
`DataStream.filter(FilterFunction)` transforma
 
 ## Documentation
 
-- [DataStream 
API](https://ci.apache.org/projects/flink/flink-docs-stable/dev/datastream_api.html)
-- [Flink 
JavaDocs](https://ci.apache.org/projects/flink/flink-docs-stable/api/java/)
+- [DataStream 
API](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/datastream/overview)
+- [Flink 
JavaDocs](https://nightlies.apache.org/flink/flink-docs-stable/api/java)
 
 ## Reference Solutions
 
diff --git a/rides-and-fares/README.md b/rides-and-fares/README.md
index b5ff381..2172513 100644
--- a/rides-and-fares/README.md
+++ b/rides-and-fares/REA

[flink-web] branch asf-site updated (fd4ceb1 -> 0ea0f00)

2021-10-04 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from fd4ceb1  Update buffer_debloating chart once more time
 add acc8158  [hotfix] minor fixes and improvements in the 1.14 release post
 add 0ea0f00  Rebuild website

No new revisions were added by this update.

Summary of changes:
 _posts/2021-09-29-release-1.14.0.md | 25 -
 content/blog/feed.xml   | 23 +++
 content/news/2021/09/29/release-1.14.0.html | 23 +++
 3 files changed, 34 insertions(+), 37 deletions(-)


[flink-training] branch master updated: [hotfix] remove extraneous files for non-existing modules

2021-10-04 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 1d922de  [hotfix] remove extraneous files for non-existing modules
1d922de is described below

commit 1d922de3185c41df05129a5faab69e74fc5cecec
Author: Nico Kruber 
AuthorDate: Mon Oct 4 13:29:19 2021 +0200

[hotfix] remove extraneous files for non-existing modules
---
 .../src/main/resources/log4j2.properties   | 24 --
 .../src/main/resources/log4j2.properties   | 24 --
 .../src/main/resources/log4j2.properties   | 24 --
 .../src/main/resources/log4j2.properties   | 24 --
 .../exercise/src/main/resources/log4j2.properties  | 24 --
 .../src/main/resources/log4j2.properties   | 24 --
 .../solution/src/main/resources/log4j2.properties  | 24 --
 .../src/main/resources/log4j2.properties   | 24 --
 8 files changed, 192 deletions(-)

diff --git a/troubleshooting/checkpointing/src/main/resources/log4j2.properties 
b/troubleshooting/checkpointing/src/main/resources/log4j2.properties
deleted file mode 100644
index 8319d24..000
--- a/troubleshooting/checkpointing/src/main/resources/log4j2.properties
+++ /dev/null
@@ -1,24 +0,0 @@
-
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
-#  regarding copyright ownership.  The ASF licenses this file
-#  to you under the Apache License, Version 2.0 (the
-#  "License"); you may not use this file except in compliance
-#  with the License.  You may obtain a copy of the License at
-#
-#  http://www.apache.org/licenses/LICENSE-2.0
-#
-#  Unless required by applicable law or agreed to in writing, software
-#  distributed under the License is distributed on an "AS IS" BASIS,
-#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#  See the License for the specific language governing permissions and
-# limitations under the License.
-
-loogers=rootLooger
-appender.console.type=Console
-appender.console.name=STDOUT
-appender.console.layout.type=PatternLayout
-appender.console.layout.pattern=%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - 
%msg%n
-rootLogger.level=INFO
-rootLogger.appenderRef.console.ref=STDOUT
\ No newline at end of file
diff --git 
a/troubleshooting/external-enrichment/src/main/resources/log4j2.properties 
b/troubleshooting/external-enrichment/src/main/resources/log4j2.properties
deleted file mode 100644
index 8319d24..000
--- a/troubleshooting/external-enrichment/src/main/resources/log4j2.properties
+++ /dev/null
@@ -1,24 +0,0 @@
-
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
-#  regarding copyright ownership.  The ASF licenses this file
-#  to you under the Apache License, Version 2.0 (the
-#  "License"); you may not use this file except in compliance
-#  with the License.  You may obtain a copy of the License at
-#
-#  http://www.apache.org/licenses/LICENSE-2.0
-#
-#  Unless required by applicable law or agreed to in writing, software
-#  distributed under the License is distributed on an "AS IS" BASIS,
-#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-#  See the License for the specific language governing permissions and
-# limitations under the License.
-
-loogers=rootLooger
-appender.console.type=Console
-appender.console.name=STDOUT
-appender.console.layout.type=PatternLayout
-appender.console.layout.pattern=%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - 
%msg%n
-rootLogger.level=INFO
-rootLogger.appenderRef.console.ref=STDOUT
\ No newline at end of file
diff --git a/troubleshooting/introduction/src/main/resources/log4j2.properties 
b/troubleshooting/introduction/src/main/resources/log4j2.properties
deleted file mode 100644
index 8319d24..000
--- a/troubleshooting/introduction/src/main/resources/log4j2.properties
+++ /dev/null
@@ -1,24 +0,0 @@
-
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional infor

[flink] branch release-1.14 updated: [hotfix][docs] fix link from upsert-kafka to kafka Table(!) connector

2021-09-20 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch release-1.14
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.14 by this push:
 new 026b0cb  [hotfix][docs] fix link from upsert-kafka to kafka Table(!) 
connector
026b0cb is described below

commit 026b0cbbb711a1638303649e4b139d4aa4722a0e
Author: Nico Kruber 
AuthorDate: Fri Sep 17 17:30:22 2021 +0200

[hotfix][docs] fix link from upsert-kafka to kafka Table(!) connector
---
 docs/content.zh/docs/connectors/table/upsert-kafka.md | 2 +-
 docs/content/docs/connectors/table/upsert-kafka.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/upsert-kafka.md 
b/docs/content.zh/docs/connectors/table/upsert-kafka.md
index da2a935..298a5f9 100644
--- a/docs/content.zh/docs/connectors/table/upsert-kafka.md
+++ b/docs/content.zh/docs/connectors/table/upsert-kafka.md
@@ -88,7 +88,7 @@ GROUP BY user_region;
 Available Metadata
 --
 
-See the [regular Kafka connector]({{< ref "docs/connectors/datastream/kafka" 
>}}#available-metadata) for a list
+See the [regular Kafka connector]({{< ref "docs/connectors/table/kafka" 
>}}#available-metadata) for a list
 of all available metadata fields.
 
 连接器参数
diff --git a/docs/content/docs/connectors/table/upsert-kafka.md 
b/docs/content/docs/connectors/table/upsert-kafka.md
index 03d87da..4093864 100644
--- a/docs/content/docs/connectors/table/upsert-kafka.md
+++ b/docs/content/docs/connectors/table/upsert-kafka.md
@@ -97,7 +97,7 @@ GROUP BY user_region;
 Available Metadata
 --
 
-See the [regular Kafka connector]({{< ref "docs/connectors/datastream/kafka" 
>}}#available-metadata) for a list
+See the [regular Kafka connector]({{< ref "docs/connectors/table/kafka" 
>}}#available-metadata) for a list
 of all available metadata fields.
 
 Connector Options


[flink] branch release-1.12 updated: [hotfix][docs] fix link from upsert-kafka to kafka Table(!) connector

2021-09-20 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch release-1.12
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.12 by this push:
 new 44e629e  [hotfix][docs] fix link from upsert-kafka to kafka Table(!) 
connector
44e629e is described below

commit 44e629e5f80d6b03d6c524e6923b5bc099dcbdc2
Author: Nico Kruber 
AuthorDate: Fri Sep 17 17:30:22 2021 +0200

[hotfix][docs] fix link from upsert-kafka to kafka Table(!) connector
---
 docs/dev/table/connectors/upsert-kafka.md| 2 +-
 docs/dev/table/connectors/upsert-kafka.zh.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/dev/table/connectors/upsert-kafka.md 
b/docs/dev/table/connectors/upsert-kafka.md
index 88879db..78be4ef 100644
--- a/docs/dev/table/connectors/upsert-kafka.md
+++ b/docs/dev/table/connectors/upsert-kafka.md
@@ -103,7 +103,7 @@ GROUP BY user_region;
 Available Metadata
 --
 
-See the [regular Kafka connector]({% link dev/connectors/kafka.md 
%}#available-metadata) for a list
+See the [regular Kafka connector]({% link dev/table/connectors/kafka.md 
%}#available-metadata) for a list
 of all available metadata fields.
 
 Connector Options
diff --git a/docs/dev/table/connectors/upsert-kafka.zh.md 
b/docs/dev/table/connectors/upsert-kafka.zh.md
index c862340..c0f56c5 100644
--- a/docs/dev/table/connectors/upsert-kafka.zh.md
+++ b/docs/dev/table/connectors/upsert-kafka.zh.md
@@ -94,7 +94,7 @@ GROUP BY user_region;
 Available Metadata
 --
 
-See the [regular Kafka connector]({% link dev/connectors/kafka.zh.md 
%}#available-metadata) for a list
+See the [regular Kafka connector]({% link dev/table/connectors/kafka.zh.md 
%}#available-metadata) for a list
 of all available metadata fields.
 
 连接器参数


[flink] branch release-1.13 updated: [hotfix][docs] fix link from upsert-kafka to kafka Table(!) connector

2021-09-20 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.13 by this push:
 new 1feff25  [hotfix][docs] fix link from upsert-kafka to kafka Table(!) 
connector
1feff25 is described below

commit 1feff2591286002ee12ef412b7f7bec760ff82a6
Author: Nico Kruber 
AuthorDate: Fri Sep 17 17:30:22 2021 +0200

[hotfix][docs] fix link from upsert-kafka to kafka Table(!) connector
---
 docs/content.zh/docs/connectors/table/upsert-kafka.md | 2 +-
 docs/content/docs/connectors/table/upsert-kafka.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/upsert-kafka.md 
b/docs/content.zh/docs/connectors/table/upsert-kafka.md
index d269e67..d9832f9 100644
--- a/docs/content.zh/docs/connectors/table/upsert-kafka.md
+++ b/docs/content.zh/docs/connectors/table/upsert-kafka.md
@@ -88,7 +88,7 @@ GROUP BY user_region;
 Available Metadata
 --
 
-See the [regular Kafka connector]({{< ref "docs/connectors/datastream/kafka" 
>}}#available-metadata) for a list
+See the [regular Kafka connector]({{< ref "docs/connectors/table/kafka" 
>}}#available-metadata) for a list
 of all available metadata fields.
 
 连接器参数
diff --git a/docs/content/docs/connectors/table/upsert-kafka.md 
b/docs/content/docs/connectors/table/upsert-kafka.md
index 03d87da..4093864 100644
--- a/docs/content/docs/connectors/table/upsert-kafka.md
+++ b/docs/content/docs/connectors/table/upsert-kafka.md
@@ -97,7 +97,7 @@ GROUP BY user_region;
 Available Metadata
 --
 
-See the [regular Kafka connector]({{< ref "docs/connectors/datastream/kafka" 
>}}#available-metadata) for a list
+See the [regular Kafka connector]({{< ref "docs/connectors/table/kafka" 
>}}#available-metadata) for a list
 of all available metadata fields.
 
 Connector Options


[flink] branch master updated (be1da48 -> 6e0bd35)

2021-09-20 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from be1da48  [FLINK-22944][state] Re-use output in StateChangeLogger
 add 6e0bd35  [hotfix][docs] fix link from upsert-kafka to kafka Table(!) 
connector

No new revisions were added by this update.

Summary of changes:
 docs/content.zh/docs/connectors/table/upsert-kafka.md | 2 +-
 docs/content/docs/connectors/table/upsert-kafka.md| 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)


[flink-training] branch master updated: [FLINK-24118] Allow TaxiFareGenerator to produce bounded streams (#40)

2021-09-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new bfcd25a  [FLINK-24118] Allow TaxiFareGenerator to produce bounded 
streams (#40)
bfcd25a is described below

commit bfcd25a9d52e71f018720ea4865090b5bab5b135
Author: David Anderson 
AuthorDate: Fri Sep 3 08:36:41 2021 -0600

[FLINK-24118] Allow TaxiFareGenerator to produce bounded streams (#40)
---
 .../exercises/common/sources/TaxiFareGenerator.java   | 19 ++-
 .../exercises/common/utils/DataGenerator.java |  4 ++--
 .../training/exercises/hourlytips/HourlyTipsTest.java |  8 
 3 files changed, 24 insertions(+), 7 deletions(-)

diff --git 
a/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiFareGenerator.java
 
b/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiFareGenerator.java
index 0866e0d..58fbe68 100644
--- 
a/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiFareGenerator.java
+++ 
b/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiFareGenerator.java
@@ -20,6 +20,10 @@ package org.apache.flink.training.exercises.common.sources;
 
 import org.apache.flink.streaming.api.functions.source.SourceFunction;
 import org.apache.flink.training.exercises.common.datatypes.TaxiFare;
+import org.apache.flink.training.exercises.common.utils.DataGenerator;
+
+import java.time.Duration;
+import java.time.Instant;
 
 /**
  * This SourceFunction generates a data stream of TaxiFare records.
@@ -29,6 +33,14 @@ import 
org.apache.flink.training.exercises.common.datatypes.TaxiFare;
 public class TaxiFareGenerator implements SourceFunction<TaxiFare> {
 
 private volatile boolean running = true;
+private Instant limitingTimestamp = Instant.MAX;
+
+/** Create a bounded TaxiFareGenerator that runs only for the specified 
duration. */
+public static TaxiFareGenerator runFor(Duration duration) {
+TaxiFareGenerator generator = new TaxiFareGenerator();
+generator.limitingTimestamp = DataGenerator.BEGINNING.plus(duration);
+return generator;
+}
 
 @Override
 public void run(SourceContext<TaxiFare> ctx) throws Exception {
@@ -37,8 +49,13 @@ public class TaxiFareGenerator implements SourceFunction<TaxiFare> {
 
 while (running) {
 TaxiFare fare = new TaxiFare(id);
-id += 1;
 
+// don't emit events that exceed the specified limit
+if (fare.startTime.compareTo(limitingTimestamp) >= 0) {
+break;
+}
+
+++id;
 ctx.collect(fare);
 
 // match our event production rate to that of the TaxiRideGenerator
diff --git 
a/common/src/main/java/org/apache/flink/training/exercises/common/utils/DataGenerator.java
 
b/common/src/main/java/org/apache/flink/training/exercises/common/utils/DataGenerator.java
index 10c4c2b..078c43a 100644
--- 
a/common/src/main/java/org/apache/flink/training/exercises/common/utils/DataGenerator.java
+++ 
b/common/src/main/java/org/apache/flink/training/exercises/common/utils/DataGenerator.java
@@ -32,7 +32,7 @@ public class DataGenerator {
 
 private static final int SECONDS_BETWEEN_RIDES = 20;
 private static final int NUMBER_OF_DRIVERS = 200;
-private static final Instant beginTime = 
Instant.parse("2020-01-01T12:00:00.00Z");
+public static final Instant BEGINNING = 
Instant.parse("2020-01-01T12:00:00.00Z");
 
 private transient long rideId;
 
@@ -43,7 +43,7 @@ public class DataGenerator {
 
 /** Deterministically generates and returns the startTime for this ride. */
 public Instant startTime() {
-return beginTime.plusSeconds(SECONDS_BETWEEN_RIDES * rideId);
+return BEGINNING.plusSeconds(SECONDS_BETWEEN_RIDES * rideId);
 }
 
 /** Deterministically generates and returns the endTime for this ride. */
diff --git 
a/hourly-tips/src/test/java/org/apache/flink/training/exercises/hourlytips/HourlyTipsTest.java
 
b/hourly-tips/src/test/java/org/apache/flink/training/exercises/hourlytips/HourlyTipsTest.java
index 8167eba..1e379c3 100644
--- 
a/hourly-tips/src/test/java/org/apache/flink/training/exercises/hourlytips/HourlyTipsTest.java
+++ 
b/hourly-tips/src/test/java/org/apache/flink/training/exercises/hourlytips/HourlyTipsTest.java
@@ -24,6 +24,7 @@ import 
org.apache.flink.runtime.testutils.MiniClusterResourceConfiguration;
 import org.apache.flink.streaming.api.functions.source.SourceFunction;
 import org.apache.flink.test.util.MiniClusterWithClientResource;
 import org.apache.flink.training.exercises.common.datatypes.TaxiFare;
+import org.apache.flink.training.exercises.common.utils.DataGenerator;
 import org.apache.flink.training.exercises.testing.ComposedPipeline;
 import org.apache.flink.training.exercises.tes

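The commit above bounds an otherwise endless source by capping event time rather than counting records. Stripped of the Flink `SourceFunction` machinery, the pattern can be sketched as a standalone class (simplified; `BEGINNING` and the 20-second event spacing mirror the `DataGenerator` constants in the diff):

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

public class BoundedGenerator {
    // Stand-ins for DataGenerator.BEGINNING and SECONDS_BETWEEN_RIDES above.
    static final Instant BEGINNING = Instant.parse("2020-01-01T12:00:00.00Z");
    static final int SECONDS_BETWEEN_EVENTS = 20;

    private Instant limitingTimestamp = Instant.MAX;

    /** Mirrors TaxiFareGenerator.runFor(Duration): bound the stream by event time. */
    static BoundedGenerator runFor(Duration duration) {
        BoundedGenerator generator = new BoundedGenerator();
        generator.limitingTimestamp = BEGINNING.plus(duration);
        return generator;
    }

    /** Emits event timestamps until the limit is reached, like run(SourceContext). */
    List<Instant> run() {
        List<Instant> out = new ArrayList<>();
        long id = 1;
        while (true) {
            Instant eventTime = BEGINNING.plusSeconds(SECONDS_BETWEEN_EVENTS * id);
            // don't emit events that reach or exceed the specified limit
            if (eventTime.compareTo(limitingTimestamp) >= 0) {
                break;
            }
            out.add(eventTime);
            ++id;
        }
        return out;
    }

    public static void main(String[] args) {
        // With one event every 20s and a 60s bound, events at +20s and +40s
        // are emitted; the event at +60s hits the limit and stops the loop.
        System.out.println(BoundedGenerator.runFor(Duration.ofSeconds(60)).run().size());
    }
}
```

Bounding by event time rather than by record count keeps the cut-off aligned with window boundaries, which is what makes the generator usable in deterministic tests such as the `HourlyTipsTest` changes in this commit.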
[flink-training] 02/02: [hotfix] collect rather than collectWithTimestamp

2021-09-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 1b6720707c6f97b4dc502b58ec8229fac0c4bee5
Author: David Anderson 
AuthorDate: Wed Sep 1 10:19:28 2021 -0600

[hotfix] collect rather than collectWithTimestamp
---
 .../flink/training/exercises/common/sources/TaxiRideGenerator.java| 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git 
a/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiRideGenerator.java
 
b/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiRideGenerator.java
index b3776d9..975ff2b 100644
--- 
a/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiRideGenerator.java
+++ 
b/common/src/main/java/org/apache/flink/training/exercises/common/sources/TaxiRideGenerator.java
@@ -69,9 +69,7 @@ public class TaxiRideGenerator implements SourceFunction<TaxiRide> {
 
 // then emit the new START events (out-of-order)
 java.util.Collections.shuffle(startEvents, new Random(id));
-startEvents
-.iterator()
-.forEachRemaining(r -> ctx.collectWithTimestamp(r, 
r.getEventTimeMillis()));
+startEvents.iterator().forEachRemaining(r -> ctx.collect(r));
 
 // prepare for the next batch
 id += BATCH_SIZE;


[flink-training] 01/02: [FLINK-23926] replace startTime and endTime with a single eventTime field

2021-09-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit fd65b110714cbf487435b5dfdf0374550bd0c820
Author: David Anderson 
AuthorDate: Thu Sep 2 10:16:33 2021 -0600

[FLINK-23926] replace startTime and endTime with a single eventTime field
---
 README.md  |  6 +-
 .../exercises/common/datatypes/TaxiFare.java   |  4 +-
 .../exercises/common/datatypes/TaxiRide.java   | 41 +---
 .../common/sources/TaxiRideGenerator.java  |  6 +-
 .../exercises/common/utils/DataGenerator.java  |  4 +-
 .../solutions/hourlytips/HourlyTipsSolution.java   |  3 +-
 .../hourlytips/scala/HourlyTipsSolution.scala  |  2 +-
 long-ride-alerts/DISCUSSION.md | 33 ++
 long-ride-alerts/README.md |  7 ++-
 .../exercises/longrides/LongRidesExercise.java |  2 +-
 .../longrides/scala/LongRidesExercise.scala|  2 +-
 .../solutions/longrides/LongRidesSolution.java | 45 +++--
 .../longrides/scala/LongRidesSolution.scala| 35 +++
 .../exercises/longrides/LongRidesTestBase.java | 10 ++-
 .../exercises/longrides/LongRidesUnitTest.java | 73 ++
 .../ridecleansing/RideCleansingTestBase.java   | 12 +---
 .../ridesandfares/RidesAndFaresTestBase.java   |  3 +-
 .../ridesandfares/RidesAndFaresUnitTest.java   |  4 +-
 18 files changed, 173 insertions(+), 119 deletions(-)

diff --git a/README.md b/README.md
index 1229f5b..040815e 100644
--- a/README.md
+++ b/README.md
@@ -153,9 +153,7 @@ rideId : Long  // a unique id for each ride
 taxiId : Long  // a unique id for each taxi
 driverId   : Long  // a unique id for each driver
 isStart: Boolean   // TRUE for ride start events, FALSE for ride end 
events
-startTime  : Instant   // the start time of a ride
-endTime: Instant   // the end time of a ride,
-   //   "1970-01-01 00:00:00" for start events
+eventTime  : Instant   // the timestamp for this event
 startLon   : Float // the longitude of the ride start location
 startLat   : Float // the latitude of the ride start location
 endLon : Float // the longitude of the ride end location
@@ -171,7 +169,7 @@ There is also a related data set containing fare data about 
those same rides, wi
 rideId : Long  // a unique id for each ride
 taxiId : Long  // a unique id for each taxi
 driverId   : Long  // a unique id for each driver
-startTime  : Instant   // the start time of a ride
+startTime  : Instant   // the start time for this ride
 paymentType: String// CASH or CARD
 tip: Float // tip for this ride
 tolls  : Float // tolls for this ride
diff --git 
a/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiFare.java
 
b/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiFare.java
index 367477c..2b1a7f9 100644
--- 
a/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiFare.java
+++ 
b/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiFare.java
@@ -130,13 +130,13 @@ public class TaxiFare implements Serializable {
 }
 
 /** Gets the fare's start time. */
-public long getEventTime() {
+public long getEventTimeMillis() {
 return startTime.toEpochMilli();
 }
 
 /** Creates a StreamRecord, using the fare and its timestamp. Used in 
tests. */
 @VisibleForTesting
 public StreamRecord<TaxiFare> asStreamRecord() {
-return new StreamRecord<>(this, this.getEventTime());
+return new StreamRecord<>(this, this.getEventTimeMillis());
 }
 }
diff --git 
a/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiRide.java
 
b/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiRide.java
index 3613762..c99d367 100644
--- 
a/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiRide.java
+++ 
b/common/src/main/java/org/apache/flink/training/exercises/common/datatypes/TaxiRide.java
@@ -42,8 +42,7 @@ public class TaxiRide implements Comparable<TaxiRide>, Serializable {
 
 /** Creates a new TaxiRide with now as start and end time. */
 public TaxiRide() {
-this.startTime = Instant.now();
-this.endTime = Instant.now();
+this.eventTime = Instant.now();
 }
 
 /** Invents a TaxiRide. */
@@ -52,8 +51,7 @@ public class TaxiRide implements Comparable<TaxiRide>, Serializable {
 
 this.rideId = rideId;
 this.isStart = isStart;
-this.startTime = g.startTime();
-this.endTime = isStart ? Instant.ofEpochMilli(0) : g.endTime();
+this.eventTime = isStart ? g.startTime() : g.endTime();

[flink-training] branch master updated (90da3df -> 1b67207)

2021-09-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from 90da3df  [hotfix] trim whitespace before checking enable_scala
 new fd65b11  [FLINK-23926] replace startTime and endTime with a single 
eventTime field
 new 1b67207  [hotfix] collect rather than collectWithTimestamp

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 README.md  |  6 +-
 .../exercises/common/datatypes/TaxiFare.java   |  4 +-
 .../exercises/common/datatypes/TaxiRide.java   | 41 +---
 .../common/sources/TaxiRideGenerator.java  |  8 +--
 .../exercises/common/utils/DataGenerator.java  |  4 +-
 .../solutions/hourlytips/HourlyTipsSolution.java   |  3 +-
 .../hourlytips/scala/HourlyTipsSolution.scala  |  2 +-
 long-ride-alerts/DISCUSSION.md | 33 ++
 long-ride-alerts/README.md |  7 ++-
 .../exercises/longrides/LongRidesExercise.java |  2 +-
 .../longrides/scala/LongRidesExercise.scala|  2 +-
 .../solutions/longrides/LongRidesSolution.java | 45 +++--
 .../longrides/scala/LongRidesSolution.scala| 35 +++
 .../exercises/longrides/LongRidesTestBase.java | 10 ++-
 .../exercises/longrides/LongRidesUnitTest.java | 73 ++
 .../ridecleansing/RideCleansingTestBase.java   | 12 +---
 .../ridesandfares/RidesAndFaresTestBase.java   |  3 +-
 .../ridesandfares/RidesAndFaresUnitTest.java   |  4 +-
 18 files changed, 173 insertions(+), 121 deletions(-)


[flink-training] branch master updated: [hotfix] trim whitespace before checking enable_scala

2021-09-02 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 90da3df  [hotfix] trim whitespace before checking enable_scala
90da3df is described below

commit 90da3df94809f9af00857dd0747c697b6a00e441
Author: Nico Kruber 
AuthorDate: Wed Sep 1 22:22:01 2021 +0200

[hotfix] trim whitespace before checking enable_scala
---
 build.gradle | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/build.gradle b/build.gradle
index 9178112..ce693c6 100644
--- a/build.gradle
+++ b/build.gradle
@@ -43,7 +43,7 @@ allprojects {
 
 subprojects {
 apply plugin: 'java'
-if (project.properties['org.gradle.project.enable_scala'] == 'true') {
+if (project.properties['org.gradle.project.enable_scala'].trim() == 
'true') {
 apply plugin: 'scala'
 }
 apply plugin: 'com.github.johnrengelman.shadow'
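The trimmed comparison above assumes the property is always present (it is, since `gradle.properties` defines a default). A null-safe form of the same check, sketched in plain Java with invented names:

```java
// Hypothetical plain-Java analogue of the Gradle property check above:
// trim surrounding whitespace before comparing against "true", and treat a
// missing property as "Scala disabled" rather than failing.
public class ScalaToggleSketch {
    static boolean isScalaEnabled(String rawPropertyValue) {
        return rawPropertyValue != null && rawPropertyValue.trim().equals("true");
    }
}
```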


[flink-training] 02/02: [hotfix][CI] also verify tests with Scala disabled (default)

2021-09-01 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit f5630bee7dad6801fff359be02b7a93530509104
Author: Nico Kruber 
AuthorDate: Wed Sep 1 10:28:46 2021 +0200

[hotfix][CI] also verify tests with Scala disabled (default)
---
 .travis.yml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.travis.yml b/.travis.yml
index cc4e815..a1c252d 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -4,6 +4,7 @@ jdk:
   - openjdk11
 
 script:
+  - ./gradlew build --scan --stacktrace -Porg.gradle.project.enable_scala=false
   - ./gradlew build --scan --stacktrace -Porg.gradle.project.enable_scala=true
 
 before_cache:


[flink-training] branch master updated (a66d5f1 -> f5630be)

2021-09-01 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from a66d5f1  [hotfix] update flink version to 1.13.2
 new eb0bd43  [hotfix] test imports wrong solution class
 new f5630be  [hotfix][CI] also verify tests with Scala disabled (default)

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .travis.yml | 1 +
 .../flink/training/exercises/ridesandfares/RidesAndFaresUnitTest.java   | 2 +-
 2 files changed, 2 insertions(+), 1 deletion(-)


[flink-training] 01/02: [hotfix] test imports wrong solution class

2021-09-01 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit eb0bd43a2d19b38ef4b482e7376e321312733764
Author: David Anderson 
AuthorDate: Tue Aug 31 19:48:49 2021 -0600

[hotfix] test imports wrong solution class
---
 .../flink/training/exercises/ridesandfares/RidesAndFaresUnitTest.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/rides-and-fares/src/test/java/org/apache/flink/training/exercises/ridesandfares/RidesAndFaresUnitTest.java
 
b/rides-and-fares/src/test/java/org/apache/flink/training/exercises/ridesandfares/RidesAndFaresUnitTest.java
index 1fd42c7..93b0167 100644
--- 
a/rides-and-fares/src/test/java/org/apache/flink/training/exercises/ridesandfares/RidesAndFaresUnitTest.java
+++ 
b/rides-and-fares/src/test/java/org/apache/flink/training/exercises/ridesandfares/RidesAndFaresUnitTest.java
@@ -10,7 +10,7 @@ import 
org.apache.flink.training.exercises.common.datatypes.RideAndFare;
 import org.apache.flink.training.exercises.common.datatypes.TaxiFare;
 import org.apache.flink.training.exercises.common.datatypes.TaxiRide;
 import 
org.apache.flink.training.exercises.testing.ComposedRichCoFlatMapFunction;
-import 
org.apache.flink.training.solutions.ridesandfares.scala.RidesAndFaresSolution;
+import org.apache.flink.training.solutions.ridesandfares.RidesAndFaresSolution;
 
 import org.junit.Before;
 import org.junit.Test;


[flink-training] branch master updated (177cadf -> a66d5f1)

2021-09-01 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from 177cadf  [DO-NOT-MERGE][FLINK-23670] Add refactoring commit to 
.git-blame-ignore-revs
 add a66d5f1  [hotfix] update flink version to 1.13.2

No new revisions were added by this update.

Summary of changes:
 build.gradle | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


[flink-training] branch master updated (bbb051a -> 177cadf)

2021-09-01 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from bbb051a  [FLINK-23653] exercise and test rework (#31)
 add bd4a088  [FLINK-23670] add Scala checks using scalafmt via spotless
 add a22bc8f  [FLINK-23670] Format Scala code with Spotless/scalafmt
 add 177cadf  [DO-NOT-MERGE][FLINK-23670] Add refactoring commit to 
.git-blame-ignore-revs

No new revisions were added by this update.

Summary of changes:
 .git-blame-ignore-revs |  2 +-
 .scalafmt.conf |  1 +
 CONTRIBUTING.md| 12 ++
 build.gradle   | 28 +-
 .../hourlytips/scala/HourlyTipsExercise.scala  |  8 ++-
 .../hourlytips/scala/HourlyTipsSolution.scala  | 24 +--
 .../hourlytips/scala/HourlyTipsTest.scala  | 21 +---
 .../longrides/scala/LongRidesExercise.scala| 25 +--
 .../longrides/scala/LongRidesSolution.scala| 26 ++--
 .../longrides/scala/LongRidesIntegrationTest.scala |  4 +---
 .../longrides/scala/LongRidesUnitTest.scala|  4 +---
 .../scala/RideCleansingExercise.scala  |  8 ++-
 .../scala/RideCleansingSolution.scala  |  8 ++-
 .../scala/RideCleansingIntegrationTest.scala   |  1 -
 .../scala/RideCleansingUnitTest.scala  |  7 +++---
 .../scala/RidesAndFaresExercise.scala  | 12 +-
 .../scala/RidesAndFaresSolution.scala  | 18 +++---
 .../scala/RidesAndFaresIntegrationTest.scala   | 22 -
 .../scala/RidesAndFaresUnitTest.scala  |  6 ++---
 19 files changed, 133 insertions(+), 104 deletions(-)
 create mode 100644 .scalafmt.conf


[flink-training] branch master updated: [FLINK-24022] Change how to enable Scala and use in CI

2021-08-30 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 65cde79  [FLINK-24022] Change how to enable Scala and use in CI
65cde79 is described below

commit 65cde7910e5bff6283eef7ea536fde3ea5d4c9d4
Author: Nico Kruber 
AuthorDate: Fri Aug 27 12:23:50 2021 +0200

[FLINK-24022] Change how to enable Scala and use in CI
---
 .travis.yml   |  2 +-
 README.md | 14 +++---
 build.gradle  |  4 +++-
 gradle.properties |  3 +++
 4 files changed, 14 insertions(+), 9 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index 41126f9..cc4e815 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -4,7 +4,7 @@ jdk:
   - openjdk11
 
 script:
-  - ./gradlew build --scan --stacktrace
+  - ./gradlew build --scan --stacktrace -Porg.gradle.project.enable_scala=true
 
 before_cache:
   - rm -f  $HOME/.gradle/caches/modules-2/modules-2.lock
diff --git a/README.md b/README.md
index 9c60124..5d40526 100644
--- a/README.md
+++ b/README.md
@@ -111,14 +111,14 @@ If you are in China, we recommend configuring the Maven 
repository to use a mirr
 
 The exercises in this project are also available in Scala but due to a couple
 of reported problems from non-Scala users, we decided to disable these by
-default. You can re-enable all Scala exercises and solutions by uncommenting
-this section in our [`build.gradle`](build.gradle) file:
+default. You can re-enable all Scala exercises and solutions by adapting the
+[`gradle.properties`](gradle.properties) file like this:
 
-```groovy
-subprojects {
-//...
-apply plugin: 'scala' // optional; uncomment if needed
-}
+```properties
+#...
+
+# Scala exercises can be enabled by setting this to true
+org.gradle.project.enable_scala = true
 ```
 
 You can also selectively apply this plugin in a single subproject if desired.
diff --git a/build.gradle b/build.gradle
index 3117591..93323ee 100644
--- a/build.gradle
+++ b/build.gradle
@@ -43,7 +43,9 @@ allprojects {
 
 subprojects {
 apply plugin: 'java'
-// apply plugin: 'scala' // optional; uncomment if needed
+if (project.properties['org.gradle.project.enable_scala'] == 'true') {
+apply plugin: 'scala'
+}
 apply plugin: 'com.github.johnrengelman.shadow'
 apply plugin: 'checkstyle'
 apply plugin: 'eclipse'
diff --git a/gradle.properties b/gradle.properties
index 09a00db..6be414c 100644
--- a/gradle.properties
+++ b/gradle.properties
@@ -1,2 +1,5 @@
 org.gradle.caching = true
 org.gradle.parallel = true
+
+# Scala exercises can be enabled by setting this to true
+org.gradle.project.enable_scala = false


[flink] branch master updated (25bce2f -> a86f8a5)

2021-08-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 25bce2f  [FLINK-21290][table-planner] Support project push down for 
window TVF
 add a86f8a5  [FLINK-23812][rocksdb] support configuring RocksDB logging 
(#16848)

No new revisions were added by this update.

Summary of changes:
 .../rocksdb_configurable_configuration.html| 20 -
 .../state/DefaultConfigurableOptionsFactory.java   | 89 +-
 .../state/RocksDBConfigurableOptions.java  | 64 +++-
 .../state/RocksDBStateBackendConfigTest.java   | 18 +
 4 files changed, 169 insertions(+), 22 deletions(-)


[flink] branch master updated: [FLINK-12141] [API/Type Serialization System] Allow @TypeInfo annotation on POJO field declarations (#8344)

2021-08-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new d326b05  [FLINK-12141] [API/Type Serialization System] Allow @TypeInfo 
annotation on POJO field declarations (#8344)
d326b05 is described below

commit d326b0574a373bd5eef63a44261f8762709265f8
Author: Leev 
AuthorDate: Tue Aug 17 02:44:46 2021 +0800

[FLINK-12141] [API/Type Serialization System] Allow @TypeInfo annotation on 
POJO field declarations (#8344)

- build on existing code, extend checks, tests, and docs

Co-authored-by: yangfei5 
Co-authored-by: Nico Kruber 
---
 .../serialization/types_serialization.md   | 15 -
 .../apache/flink/api/common/typeinfo/TypeInfo.java |  2 +-
 .../flink/api/java/typeutils/TypeExtractor.java| 53 ++--
 .../api/java/typeutils/TypeInfoFactoryTest.java| 71 ++
 4 files changed, 133 insertions(+), 8 deletions(-)

diff --git 
a/docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md
 
b/docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md
index e289df9..a490dc0 100644
--- 
a/docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md
+++ 
b/docs/content/docs/dev/datastream/fault-tolerance/serialization/types_serialization.md
@@ -522,8 +522,8 @@ env.getConfig().disableGenericTypes();
 
 A type information factory allows for plugging-in user-defined type 
information into the Flink type system.
 You have to implement `org.apache.flink.api.common.typeinfo.TypeInfoFactory` 
to return your custom type information. 
-The factory is called during the type extraction phase if the corresponding 
type has been annotated 
-with the `@org.apache.flink.api.common.typeinfo.TypeInfo` annotation. 
+The factory is called during the type extraction phase if either the 
corresponding type or a POJO's field using
+this type has been annotated with the 
`@org.apache.flink.api.common.typeinfo.TypeInfo` annotation.
 
 Type information factories can be used in both the Java and Scala API.
 
@@ -553,6 +553,17 @@ public class MyTupleTypeInfoFactory extends TypeInfoFactory<MyTuple> {
 }
 ```
 
+Instead of annotating the type itself, which may not be possible for 
third-party code, you can also
+annotate the usage of this type inside a valid Flink POJO like this:
+```java
+public class MyPojo {
+  public int id;
+
+  @TypeInfo(MyTupleTypeInfoFactory.class)
+  public MyTuple tuple;
+}
+```
+
 The method `createTypeInfo(Type, Map<String, TypeInformation<?>>)` creates 
type information for the type the factory is targeted for. 
 The parameters provide additional information about the type itself as well as 
the type's generic type parameters if available.
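The field-level lookup this commit enables can be illustrated without any Flink dependency. The annotation and factory below are stand-ins, not Flink classes; only the mechanism (annotation target widened to fields, factory class discovered via reflection) mirrors the change:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

// Stand-in sketch: widening the annotation target from TYPE to TYPE + FIELD
// lets a factory be attached to a POJO field and discovered via reflection.
// All names here are illustrative, not Flink's.
public class FieldAnnotationSketch {
    @Retention(RetentionPolicy.RUNTIME)
    @Target({ElementType.TYPE, ElementType.FIELD})
    @interface Info {
        Class<?> value();
    }

    static class MyFactory {}

    static class MyPojo {
        public int id;

        @Info(MyFactory.class)
        public String tuple;
    }

    /** Reads the annotation from a field and instantiates the referenced factory, if any. */
    static Object factoryFor(Class<?> pojo, String fieldName) {
        try {
            Field field = pojo.getField(fieldName);
            Info info = field.getAnnotation(Info.class);
            return info == null ? null : info.value().getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(e);
        }
    }
}
```

This is why the commit is useful for third-party types: the factory reference lives on the field in your own POJO, not on the type you cannot modify.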
 
diff --git 
a/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfo.java 
b/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfo.java
index 9aa01ac..84980be 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfo.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfo.java
@@ -35,7 +35,7 @@ import java.lang.reflect.Type;
  * has highest precedence (see {@link TypeExtractor#registerFactory(Type, 
Class)}).
  */
 @Documented
-@Target(ElementType.TYPE)
+@Target({ElementType.TYPE, ElementType.FIELD})
 @Retention(RetentionPolicy.RUNTIME)
 @Public
 public @interface TypeInfo {
diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
index 9cef04c..c6fb875 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
@@ -1308,6 +1308,19 @@ public class TypeExtractor {
 }
 final Type factoryDefiningType = 
factoryHierarchy.get(factoryHierarchy.size() - 1);
 
+return createTypeInfoFromFactory(
+t, in1Type, in2Type, factoryHierarchy, factory, 
factoryDefiningType);
+}
+
+/** Creates type information using a given factory. */
+@SuppressWarnings("unchecked")
+private <IN1, IN2, OUT> TypeInformation<OUT> createTypeInfoFromFactory(
+Type t,
+TypeInformation<IN1> in1Type,
+TypeInformation<IN2> in2Type,
+List<Type> factoryHierarchy,
+TypeInfoFactory<? super OUT> factory,
+Type factoryDefiningType) {
 // infer possible type parameters from input
 final Map<String, TypeInformation<?>> genericParams;
 if (factoryDefiningType instanceof ParameterizedType) {
@@ -1689,6 +1702,23 @@ public class TypeExtractor {
 return (TypeInfoFactory) 
InstantiationUtil.instantiate(factoryClass);
 }
 
+/** Returns the type information factory for an annotated field. *

[flink-training] branch master updated: [FLINK-23669] avoid using Scala >= 2.12.8 when setting up an IDE

2021-08-06 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 1704fd9  [FLINK-23669] avoid using Scala >= 2.12.8 when setting up an 
IDE
1704fd9 is described below

commit 1704fd959122aa197b48803441338c1fb0470091
Author: Nico Kruber 
AuthorDate: Fri Aug 6 13:55:45 2021 +0200

[FLINK-23669] avoid using Scala >= 2.12.8 when setting up an IDE
---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 6ff5cb5..9c60124 100644
--- a/README.md
+++ b/README.md
@@ -131,8 +131,9 @@ The project needs to be imported as a gradle project into 
your IDE.
 
 Then you should be able to open 
[`RideCleansingTest`](ride-cleansing/src/test/java/org/apache/flink/training/exercises/ridecleansing/RideCleansingTest.java)
 and run this test.
 
-> **:information_source: Note for Scala users:** You will need to use IntelliJ 
with the JetBrains Scala plugin, and you will need to add a Scala 2.12 SDK to 
the Global Libraries section of the Project Structure.
+> **:information_source: Note for Scala users:** You will need to use IntelliJ 
with the JetBrains Scala plugin, and you will need to add a Scala 2.12 SDK to 
the Global Libraries section of the Project Structure as well as to the module 
you are working on.
 > IntelliJ will ask you for the latter when you open a Scala file.
+> Please note that Scala 2.12.8 and above are not supported (see [Flink Scala 
Versions](https://ci.apache.org/projects/flink/flink-docs-stable/docs/dev/datastream/project-configuration/#scala-versions)
 for details)!
 
 ## Use the taxi data streams
 


[flink-training] branch master updated: [FLINK-23667] fix code formatting IDE instructions

2021-08-06 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 08f5bf4  [FLINK-23667] fix code formatting IDE instructions
08f5bf4 is described below

commit 08f5bf4d93d241f58d67148dc3e054a2325b64a0
Author: Nico Kruber 
AuthorDate: Fri Aug 6 13:52:55 2021 +0200

[FLINK-23667] fix code formatting IDE instructions

google-java-format should only be applied to Java files.
---
 CONTRIBUTING.md | 7 ---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 92cfe6a..9e6eb3b 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -16,11 +16,12 @@ formatting upon saving with these steps:
 1. Install the [google-java-format
plugin](https://plugins.jetbrains.com/plugin/8527-google-java-format) and
enable it for the project
-2. In the plugin settings, change the code style to "AOSP" (4-space indents)
+2. In the plugin settings, enable the plugin and change the code style to 
"AOSP" (4-space indents).
 3. Install the [Save Actions
plugin](https://plugins.jetbrains.com/plugin/7642-save-actions)
-4. Enable the plugin, along with "Optimize imports" and "Reformat file" but
-   ignoring `.*README\.md`.
+4. Enable the plugin, along with "Optimize imports" and "Reformat file".
+5. In the "Save Actions" settings page, set up a "File Path Inclusion" for 
`.*\.java`. Otherwise, you will get
+   unintended reformatting in other files you edit.
 
 ## Ignore refactoring commits
 


[flink] branch release-1.13 updated: [FLINK-23102][rest] Return empty FlameGraph if feature is disabled

2021-08-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.13 by this push:
 new 39ad632  [FLINK-23102][rest] Return empty FlameGraph if feature is 
disabled
39ad632 is described below

commit 39ad632a60896c94fb796e46c1bacfea32ea68d8
Author: Yao Zhang 
AuthorDate: Fri Jul 2 14:58:25 2021 +0800

[FLINK-23102][rest] Return empty FlameGraph if feature is disabled
---
 .../shortcodes/generated/rest_v1_dispatcher.html   | 100 +
 .../src/test/resources/rest_api_v1.snapshot|  51 ++-
 .../job-overview-drawer-flamegraph.component.html  |  10 ++-
 .../handler/job/JobVertexFlameGraphHandler.java|  41 -
 .../runtime/webmonitor/WebMonitorEndpoint.java |  16 ++--
 .../webmonitor/threadinfo/JobVertexFlameGraph.java |  13 ++-
 6 files changed, 218 insertions(+), 13 deletions(-)

diff --git a/docs/layouts/shortcodes/generated/rest_v1_dispatcher.html 
b/docs/layouts/shortcodes/generated/rest_v1_dispatcher.html
index fca2212..b68cf80 100644
--- a/docs/layouts/shortcodes/generated/rest_v1_dispatcher.html
+++ b/docs/layouts/shortcodes/generated/rest_v1_dispatcher.html
@@ -3777,6 +3777,106 @@ Using 'curl' you can upload a jar via 'curl -X POST -H 
"Expect:" -F "jarfile=@pa
 
   
 
+  /jobs/:jobid/vertices/:vertexid/flamegraph
+
+
+  Verb: GET
+  Response code: 200 OK
+
+
+  Returns flame graph information for a vertex, and may 
initiate flame graph sampling if necessary.
+
+
+  Path parameters
+
+
+  
+
+jobid - 32-character hexadecimal string value that identifies 
a job.
+vertexid - 32-character hexadecimal string value that 
identifies a job vertex.
+
+  
+
+
+  Query parameters
+
+
+  
+
+type (optional): String value that specifies the Flame Graph 
type. Supported options are: "[FULL, ON_CPU, OFF_CPU]".
+
+  
+
+
+  
+  
+
+  
+Request
+▾
+  
+  
+  
+  
+
+{}
+  
+  
+
+  
+  
+
+
+  
+  
+
+  
+Response
+▾
+  
+  
+  
+  
+
+{
+  "type" : "object",
+  "id" : 
"urn:jsonschema:org:apache:flink:runtime:webmonitor:threadinfo:JobVertexFlameGraph",
+  "properties" : {
+"data" : {
+  "type" : "object",
+  "id" : 
"urn:jsonschema:org:apache:flink:runtime:webmonitor:threadinfo:JobVertexFlameGraph:Node",
+  "properties" : {
+"children" : {
+  "type" : "array",
+  "items" : {
+"type" : "object",
+"$ref" : 
"urn:jsonschema:org:apache:flink:runtime:webmonitor:threadinfo:JobVertexFlameGraph:Node"
+  }
+},
+"name" : {
+  "type" : "string"
+},
+"value" : {
+  "type" : "integer"
+}
+  }
+},
+"endTimestamp" : {
+  "type" : "integer"
+}
+  }
+}
+  
+  
+
+  
+  
+
+  
+
+
+  
+
   /jobs/:jobid/vertices/:vertexid/metrics
 
 
diff --git a/flink-runtime-web/src/test/resources/rest_api_v1.snapshot 
b/flink-runtime-web/src/test/resources/rest_api_v1.snapshot
index bf1f190..7040dd8 100644
--- a/flink-runtime-web/src/test/resources/rest_api_v1.snapshot
+++ b/flink-runtime-web/src/test/resources/rest_api_v1.snapshot
@@ -2131,6 +2131,55 @@
   }
 }
   }, {
+"url" : "/jobs/:jobid/vertices/:vertexid/flamegraph",
+"method" : "GET",
+"status-code" : "200 OK",
+"file-upload" : false,
+"path-parameters" : {
+  "pathParameters" : [ {
+"key" : "jobid"
+  }, {
+"key" : "vertexid"
+  } ]
+},
+"query-parameters" : {
+  "queryParameters" : [ {
+"key" : "type",
+"mandatory" : false
+  } ]
+},
+"request" : {
+  "type" : "any"
+},
+"response" : {
+  "type" : "object",
+  "id" : 
"urn:jsonschema:org:apache:flink:runtime:webmonitor:threadinfo:JobVertexFlameGraph",
+  "properties" : {
+"endTimestamp" : {
+  "type" : 
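A client-side sketch of assembling the request path for the endpoint documented above; the base URL, job ID, and vertex ID would come from the caller, and `type` must be one of the documented `FULL`, `ON_CPU`, `OFF_CPU` values:

```java
// Builds the REST path of the flame graph endpoint documented above.
// The IDs passed in are placeholders supplied by the caller.
public class FlameGraphPathSketch {
    static String flameGraphPath(String jobId, String vertexId, String type) {
        return "/jobs/" + jobId + "/vertices/" + vertexId + "/flamegraph?type=" + type;
    }
}
```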

[flink] branch master updated (c99895a -> b396c35)

2021-07-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from c99895a  [FLINK-21776][metrics] Migrate JobManagerJobMG instantiations 
to factory method
 add 3ebbe60  [hotfix][flink-core] fix typo in error message
 add f8e11e1  [FLINK-23311][tests] fix PojoSerializerTest user POJO not 
taking dumm5 into account for hashCode and equals
 add a238be1  [FLINK-23311][tests] nicer error messages in 
PojoSerializerTest via hamcrest
 add b396c35  [FLINK-23311][tests] fix wrong parameter order in assertSame()

No new revisions were added by this update.

Summary of changes:
 .../apache/flink/api/common/typeinfo/Types.java|  2 +-
 .../java/typeutils/runtime/PojoSerializerTest.java | 14 +---
 .../PojoSerializerUpgradeTestSpecifications.java   | 40 +++---
 3 files changed, 31 insertions(+), 25 deletions(-)
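The test fix above addresses a classic pitfall: if a POJO's `equals`/`hashCode` omit a field (here `dumm5`), serializer round-trip assertions can pass even when that field is corrupted. A minimal illustration with invented names:

```java
import java.util.Objects;

// Minimal illustration of the pitfall fixed above: a field omitted from
// equals()/hashCode() makes two objects "equal" even when that field differs.
public class EqualsPitfallSketch {
    static final class Pojo {
        int dumm4;
        int dumm5; // deliberately NOT part of equals/hashCode below

        Pojo(int dumm4, int dumm5) {
            this.dumm4 = dumm4;
            this.dumm5 = dumm5;
        }

        @Override
        public boolean equals(Object o) {
            return o instanceof Pojo && ((Pojo) o).dumm4 == this.dumm4; // dumm5 missing!
        }

        @Override
        public int hashCode() {
            return Objects.hash(dumm4); // dumm5 missing!
        }
    }
}
```

Once `dumm5` is included in both methods, objects that differ in that field stop comparing equal, which is what makes the serializer test meaningful.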


[flink-training] branch master updated: [hotfix] fix formatting error for spotless

2021-07-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 58c5551  [hotfix] fix formatting error for spotless
58c5551 is described below

commit 58c5551e9407d4917d3139a9125f7b8570d01b2c
Author: Nico Kruber 
AuthorDate: Fri Jul 16 09:30:06 2021 +0200

[hotfix] fix formatting error for spotless
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index d0f1316..15f98ce 100644
--- a/README.md
+++ b/README.md
@@ -270,7 +270,7 @@ mainClassName = ext.javaExerciseClassName
 
 ### Useful Gradle Commands and Tricks
 
- Clean Build with all Checks  
+ Clean Build with all Checks
 
 ```bash
 ./gradlew clean check shadowJar


[flink-training] 02/02: [FLINK-23340] add more dev setup instructions

2021-07-15 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 857dd4b66c228a46044b766b27f59c03e725e287
Author: Nico Kruber 
AuthorDate: Fri Jul 9 14:42:27 2021 +0200

[FLINK-23340] add more dev setup instructions
---
 README.md | 112 +-
 1 file changed, 111 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 92c84d9..d0f1316 100644
--- a/README.md
+++ b/README.md
@@ -56,7 +56,7 @@ Flink supports Linux, OS X, and Windows as development 
environments for Flink pr
 - a JDK for Java 8 or Java 11 (a JRE is not sufficient; other versions of Java 
are currently not supported)
 - Git
 - an IDE for Java (and/or Scala) development with Gradle support.
-  We recommend IntelliJ, but Eclipse or Visual Studio Code can also be used so 
long as you stick to Java. For Scala you will need to use IntelliJ (and its 
Scala plugin).
+  We recommend [IntelliJ](https://www.jetbrains.com/idea/), but 
[Eclipse](https://www.eclipse.org/downloads/) or [Visual Studio 
Code](https://code.visualstudio.com/) (with the [Java extension 
pack](https://code.visualstudio.com/docs/java/java-tutorial)) can also be used 
so long as you stick to Java. For Scala, you will need to use IntelliJ (and its 
[Scala plugin](https://plugins.jetbrains.com/plugin/1347-scala/)).
 
 > **:information_source: Note for Windows users:** The examples of shell 
 > commands provided in the training instructions are for UNIX systems. To make 
 > things easier, you may find it worthwhile to setup cygwin or WSL. For 
 > developing Flink jobs, Windows works reasonably well: you can run a Flink 
 > cluster on a single machine, submit jobs, run the webUI, and execute jobs in 
 > the IDE.
 
@@ -217,6 +217,116 @@ Now you are ready to begin with the first exercise in our 
[**Labs**](LABS-OVERVI
 
 -
 
+## How to work on this project
+
+> **:heavy_exclamation_mark: Important:** This section contains tips for 
developers who are
+> maintaining the `flink-training` project (not so much people doing the
+> training).
+
+The following sections apply on top of the [Setup 
Instructions](#set-up-your-development-environment) above.
+
+### Code Formatting
+
+Just like [Apache Flink](https://github.com/apache/flink), we use the [Spotless
+plugin](https://github.com/diffplug/spotless/tree/main/plugin-maven) together
+with [google-java-format](https://github.com/google/google-java-format) to
+format our Java code. You can configure your IDE to automatically apply
+formatting on saving with these steps:
+
+1. Install the [google-java-format
+   plugin](https://plugins.jetbrains.com/plugin/8527-google-java-format) and
+   enable it for the project
+2. In the plugin settings, change the code style to "AOSP" (4-space indents)
+3. Install the [Save Actions
+   plugin](https://plugins.jetbrains.com/plugin/7642-save-actions)
+4. Enable the plugin, along with "Optimize imports" and "Reformat file" but
+   ignoring `.*README\.md`.
+
+### Ignoring Refactoring Commits
+
+We keep a list of big refactoring commits in `.git-blame-ignore-revs`. When 
looking at change annotations using `git blame` it's helpful to ignore these. 
You can configure git and your IDE to do so using:
+
+```bash
+git config blame.ignoreRevsFile .git-blame-ignore-revs
+```
+
+### Adding new exercises
+
+If you want to add a new exercise, we recommend copying an existing one and
+adapting it accordingly. Make sure the new subproject's `build.gradle` file
+contains appropriate class name properties so that we can create the right
+tasks for [Running Tests and Solutions on the Command 
Line](#running-exercises-tests-and-solutions-on-the-command-line), e.g.:
+
+```groovy
+ext.javaExerciseClassName = 
'org.apache.flink.training.exercises.ridesandfares.RidesAndFaresExercise'
+ext.scalaExerciseClassName = 
'org.apache.flink.training.exercises.ridesandfares.scala.RidesAndFaresExercise'
+ext.javaSolutionClassName = 
'org.apache.flink.training.solutions.ridesandfares.RidesAndFaresSolution'
+ext.scalaSolutionClassName = 
'org.apache.flink.training.solutions.ridesandfares.scala.RidesAndFaresSolution'
+
+apply plugin: 'application'
+
+mainClassName = ext.javaExerciseClassName
+```
+
+### Useful Gradle Commands and Tricks
+
+ Clean Build with all Checks  
+
+```bash
+./gradlew clean check shadowJar
+./gradlew clean check shadowJar --no-build-cache
+```
+
+ Force a Re-run of all Tests Only
+
+```bash
+./gradlew cleanTest test  --no-build-cache
+```
+
+> **:information_source: Note:** Ignoring the build-cache is required if you 
really want to run the test again (without any changes in code).
+> Otherwise the test tasks will just pull the latest test results from the 
cache.
+
+ Fix Formatting
+
+```bash
+./gradlew spotlessApply
+./gradlew :rides-and-f

[flink-training] 01/02: [hotfix] minor improvements to the README

2021-07-15 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit e5a38fcb20be8b27826ac0437706a5abfe2aa309
Author: Nico Kruber 
AuthorDate: Fri Jul 9 14:42:12 2021 +0200

[hotfix] minor improvements to the README
---
 README.md | 10 +-
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 49fa987..92c84d9 100644
--- a/README.md
+++ b/README.md
@@ -23,7 +23,7 @@ Exercises that go along with the training content in the 
documentation.
 
 ## Table of Contents
 
-[**Setup your Development Environment**](#setup-your-development-environment)
+[**Set up your Development Environment**](#set-up-your-development-environment)
 
 1. [Software requirements](#software-requirements)
 1. [Clone and build the flink-training 
project](#clone-and-build-the-flink-training-project)
@@ -45,7 +45,7 @@ Exercises that go along with the training content in the 
documentation.
 
 [**License**](#license)
 
-## Setup your Development Environment
+## Set up your Development Environment
 
 The following instructions guide you through the process of setting up a 
development environment for the purpose of developing, debugging, and executing 
solutions to the Flink developer training exercises and examples.
 
@@ -53,7 +53,7 @@ The following instructions guide you through the process of 
setting up a develop
 
 Flink supports Linux, OS X, and Windows as development environments for Flink 
programs and local execution. The following software is required for a Flink 
development setup and should be installed on your system:
 
-- a JDK for Java 8 or Java 11 (a JRE is not sufficient; other versions of Java 
are not supported)
+- a JDK for Java 8 or Java 11 (a JRE is not sufficient; other versions of Java 
are currently not supported)
 - Git
 - an IDE for Java (and/or Scala) development with Gradle support.
   We recommend IntelliJ, but Eclipse or Visual Studio Code can also be used so 
long as you stick to Java. For Scala you will need to use IntelliJ (and its 
Scala plugin).
@@ -169,7 +169,7 @@ In the hands-on sessions you will implement Flink programs 
using various Flink A
 
 The following steps guide you through the process of using the provided data 
streams, implementing your first Flink streaming program, and executing your 
program in your IDE.
 
-We assume you have setup your development environment according to our [setup 
guide above](#setup-your-development-environment).
+We assume you have set up your development environment according to our [setup 
guide above](#set-up-your-development-environment).
 
 ### Learn about the data
 
@@ -219,4 +219,4 @@ Now you are ready to begin with the first exercise in our 
[**Labs**](LABS-OVERVI
 
 ## License
 
-The code in this repository is licensed under the Apache Software License 2.
+The code in this repository is licensed under the [Apache Software License 
2](LICENSE).


[flink-training] branch master updated (5d3e49d -> 857dd4b)

2021-07-15 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from 5d3e49d  [FLINK-23339] Make scala plugin optional
 new e5a38fc  [hotfix] minor improvements to the README
 new 857dd4b  [FLINK-23340] add more dev setup instructions

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 README.md | 122 ++
 1 file changed, 116 insertions(+), 6 deletions(-)


[flink-training] branch master updated: [FLINK-23339] Make scala plugin optional

2021-07-14 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 5d3e49d  [FLINK-23339] Make scala plugin optional
5d3e49d is described below

commit 5d3e49d288f06ff5fca7db8423c4f26fd623947f
Author: Nico Kruber 
AuthorDate: Tue Jul 13 22:28:39 2021 +0200

[FLINK-23339] Make scala plugin optional
---
 README.md| 16 
 build.gradle | 34 +++---
 2 files changed, 39 insertions(+), 11 deletions(-)

diff --git a/README.md b/README.md
index 37c1d26..49fa987 100644
--- a/README.md
+++ b/README.md
@@ -101,6 +101,22 @@ If you are in China, we recommend configuring the maven 
repository to use a mirr
 
 
 
+### Enable Scala (optional)
+
+The exercises in this project are also available in Scala but, due to a couple
+of problems reported by non-Scala users, we decided to disable these by
+default. You can re-enable all Scala exercises and solutions by uncommenting
+the Scala plugin in our [`build.gradle`](build.gradle) file:
+
+```groovy
+subprojects {
+//...
+apply plugin: 'scala' // optional; uncomment if needed
+}
+```
+
+You can also selectively apply this plugin in a single subproject if desired.
+
 ### Import the flink-training project into your IDE
 
 The project needs to be imported as a gradle project into your IDE.
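The per-subproject alternative mentioned in the README text above can be sketched in the root `build.gradle` like this (the subproject name is only an example):

```groovy
// Hypothetical: enable the Scala plugin for one subproject only,
// instead of uncommenting it for all subprojects.
project(':hourly-tips') {
    apply plugin: 'scala'
}
```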
diff --git a/build.gradle b/build.gradle
index 0442b26..3117591 100644
--- a/build.gradle
+++ b/build.gradle
@@ -43,7 +43,7 @@ allprojects {
 
 subprojects {
 apply plugin: 'java'
-apply plugin: 'scala' // optional; uncomment if needed
+// apply plugin: 'scala' // optional; uncomment if needed
 apply plugin: 'com.github.johnrengelman.shadow'
 apply plugin: 'checkstyle'
 apply plugin: 'eclipse'
@@ -97,7 +97,9 @@ subprojects {
 // add solution source dirs:
 sourceSets {
 main.java.srcDirs += 'src/solution/java'
-main.scala.srcDirs += 'src/solution/scala'
+tasks.withType(ScalaCompile) {
+main.scala.srcDirs += 'src/solution/scala'
+}
 
 // Add shadow configuration to runtime class path so that the
 // dynamically-generated tasks by IntelliJ are able to run and have
@@ -110,14 +112,14 @@ subprojects {
 }
 
 project.plugins.withId('application') {
-['javaExerciseClassName', 'scalaExerciseClassName',
- 'javaSolutionClassName', 'scalaSolutionClassName'].each { property ->
-if (project.ext.has(property)) {
-project.tasks.create(name: 
classNamePropertyToTaskName(property), type: JavaExec) {
-classpath = project.sourceSets.main.runtimeClasspath
-mainClass = project.ext.get(property)
-group = 'flink-training'
-}
+['javaExerciseClassName', 'javaSolutionClassName'].each { property ->
+createTrainingRunTask(project, property)
+}
+}
+pluginManager.withPlugin('scala') {
+project.plugins.withId('application') {
+['scalaExerciseClassName', 'scalaSolutionClassName'].each { 
property ->
+createTrainingRunTask(project, property)
 }
 }
 }
@@ -179,7 +181,17 @@ tasks.register('printRunTasks') {
 }
 }
 
-static def classNamePropertyToTaskName(String property) {
+static def void createTrainingRunTask(Project project, String property) {
+if (project.ext.has(property)) {
+project.tasks.create(name: classNamePropertyToTaskName(property), 
type: JavaExec) {
+classpath = project.sourceSets.main.runtimeClasspath
+mainClass = project.ext.get(property)
+group = 'flink-training'
+}
+}
+}
+
+static def String classNamePropertyToTaskName(String property) {
 return 'run' +
 property.charAt(0).toString().toUpperCase() +
 property.substring(1, property.lastIndexOf('ClassName'))
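For illustration, the name mapping implemented by `classNamePropertyToTaskName` above can be mirrored in plain shell (a hypothetical helper, not part of the repo):

```shell
# Maps an ext property name to its generated Gradle task name,
# mirroring the Groovy classNamePropertyToTaskName above.
class_name_property_to_task_name() {
    local prop="$1"
    local base="${prop%ClassName}"   # drop the trailing 'ClassName'
    local first rest
    first=$(printf '%s' "${base:0:1}" | tr '[:lower:]' '[:upper:]')
    rest="${base:1}"
    printf 'run%s%s\n' "$first" "$rest"
}

class_name_property_to_task_name javaExerciseClassName   # -> runJavaExercise
```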


[flink] branch release-1.13 updated: [FLINK-23312][ci] speed up compilation for e2e tests

2021-07-14 Thread nkruber

nkruber pushed a commit to branch release-1.13
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.13 by this push:
 new 3618770  [FLINK-23312][ci] speed up compilation for e2e tests
3618770 is described below

commit 361877021d5ad9d6b9f076f4cffbe0472262cd8f
Author: Nico Kruber 
AuthorDate: Wed Jun 16 18:16:32 2021 +0200

[FLINK-23312][ci] speed up compilation for e2e tests

The "compile" builder already applies all checks so we can use -Dfast here;
also, the web UI is not actually needed in the E2E tests.
---
 tools/azure-pipelines/jobs-template.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tools/azure-pipelines/jobs-template.yml 
b/tools/azure-pipelines/jobs-template.yml
index df1c7ee..144c176 100644
--- a/tools/azure-pipelines/jobs-template.yml
+++ b/tools/azure-pipelines/jobs-template.yml
@@ -226,7 +226,7 @@ jobs:
 sudo apt install ./libssl1.0.0_1.0.2n-1ubuntu5.6_amd64.deb
   displayName: Prepare E2E run
   condition: not(eq(variables['SKIP'], '1'))
-- script: ${{parameters.environment}} ./tools/ci/compile.sh
+- script: ${{parameters.environment}} PROFILE="$PROFILE -Dfast 
-Pskip-webui-build" ./tools/ci/compile.sh
   displayName: Build Flink
   condition: not(eq(variables['SKIP'], '1'))
 - script: ${{parameters.environment}} FLINK_DIR=`pwd`/build-target 
./tools/azure-pipelines/uploading_watchdog.sh 
flink-end-to-end-tests/run-nightly-tests.sh e2e
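For reference, the profile and property flags added to the CI step above map to a local Maven build roughly as follows (sketch; assumes a Flink source checkout):

```shell
mvn clean install -DskipTests -Dfast -Pskip-webui-build
```

Here `-Dfast` skips the QA checks that the compile builder already ran, and `-Pskip-webui-build` leaves out the web UI that the E2E tests do not need.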


[flink] branch release-1.12 updated (331ae2c -> adfd85f)

2021-07-14 Thread nkruber

nkruber pushed a change to branch release-1.12
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 331ae2c  [FLINK-22545][coordination] Fix check during creation of 
Source Coordinator thread.
 add adfd85f  [FLINK-23312][ci] speed up compilation for e2e tests

No new revisions were added by this update.

Summary of changes:
 tools/azure-pipelines/jobs-template.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


[flink] branch release-1.11 updated: [FLINK-23312][ci] speed up compilation for e2e tests

2021-07-14 Thread nkruber

nkruber pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new ed0df86  [FLINK-23312][ci] speed up compilation for e2e tests
ed0df86 is described below

commit ed0df863d18d6e5849589e1150cbf8d340588bc8
Author: Nico Kruber 
AuthorDate: Wed Jun 16 18:16:32 2021 +0200

[FLINK-23312][ci] speed up compilation for e2e tests

The "compile" builder already applies all checks so we can use -Dfast here;
also, the web UI is not actually needed in the E2E tests.
---
 tools/azure-pipelines/jobs-template.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tools/azure-pipelines/jobs-template.yml 
b/tools/azure-pipelines/jobs-template.yml
index 1e91ed9..f016b35 100644
--- a/tools/azure-pipelines/jobs-template.yml
+++ b/tools/azure-pipelines/jobs-template.yml
@@ -176,7 +176,7 @@ jobs:
 - script: ./tools/azure-pipelines/free_disk_space.sh
   displayName: Free up disk space
 - script: sudo apt-get install -y bc
-- script: ${{parameters.environment}} STAGE=compile 
./tools/azure_controller.sh compile
+- script: ${{parameters.environment}} PROFILE="$PROFILE -Dfast 
-Pskip-webui-build" STAGE=compile ./tools/azure_controller.sh compile
   displayName: Build Flink
 # TODO remove pre-commit tests script by adding the tests to the nightly 
script
 #- script: FLINK_DIR=build-target 
./flink-end-to-end-tests/run-pre-commit-tests.sh


[flink] branch master updated (70d3f84 -> 2072026)

2021-07-14 Thread nkruber

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 70d3f84  Revert "[FLINK-18783] Load Akka with separate classloader"
 add 2072026  [FLINK-23312][ci] speed up compilation for e2e tests

No new revisions were added by this update.

Summary of changes:
 tools/azure-pipelines/jobs-template.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


[flink-training] branch master updated (9e90776 -> 5d17ed7)

2021-07-13 Thread nkruber

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from 9e90776  [FLINK-23335][gradle] add separate run tasks for Java/Scala 
exercises and solutions
 new 8399d88  [FLINK-23338] Add Spotless plugin with Google AOSP style
 new c4baefd  [FLINK-23338] Format code with Spotless/google-java-format
 new 5d17ed7  [FLINK-23338] Add .git-blame-ignore-revs for ignoring 
refactor commit

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .git-blame-ignore-revs |   1 +
 README.md  |   2 +-
 build.gradle   |  27 +
 .../examples/ridecount/RideCountExample.java   |  60 +-
 .../exercises/common/datatypes/TaxiFare.java   | 154 ++--
 .../exercises/common/datatypes/TaxiRide.java   | 297 ---
 .../common/sources/TaxiFareGenerator.java  |  35 +-
 .../common/sources/TaxiRideGenerator.java  | 109 +--
 .../exercises/common/utils/DataGenerator.java  | 304 ---
 .../exercises/common/utils/ExerciseBase.java   |  98 +--
 .../training/exercises/common/utils/GeoUtils.java  | 433 +-
 .../common/utils/MissingSolutionException.java |  10 +-
 .../exercises/testing/TaxiRideTestBase.java| 291 +++
 .../training/exercises/testing/TestSource.java |  44 +-
 config/checkstyle/checkstyle.xml   | 891 ++---
 config/checkstyle/suppressions.xml |  10 +-
 hourly-tips/DISCUSSION.md  |  34 +-
 .../exercises/hourlytips/HourlyTipsExercise.java   |  38 +-
 .../solutions/hourlytips/HourlyTipsSolution.java   |  92 +--
 .../exercises/hourlytips/HourlyTipsTest.java   |  97 +--
 long-ride-alerts/DISCUSSION.md |   2 +-
 long-ride-alerts/README.md |   8 +-
 .../exercises/longrides/LongRidesExercise.java |  61 +-
 .../solutions/longrides/LongRidesSolution.java | 136 ++--
 .../exercises/longrides/LongRidesTest.java | 178 ++--
 .../ridecleansing/RideCleansingExercise.java   |  57 +-
 .../ridecleansing/RideCleansingSolution.java   |  58 +-
 .../exercises/ridecleansing/RideCleansingTest.java |  73 +-
 .../ridesandfares/RidesAndFaresExercise.java   |  69 +-
 .../ridesandfares/RidesAndFaresSolution.java   | 162 ++--
 .../exercises/ridesandfares/RidesAndFaresTest.java |  93 +--
 31 files changed, 1950 insertions(+), 1974 deletions(-)
 create mode 100644 .git-blame-ignore-revs


[flink-training] 03/03: [FLINK-23338] Add .git-blame-ignore-revs for ignoring refactor commit

2021-07-13 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 5d17ed7eafbc636a510b4f933101226835fefa24
Author: Nico Kruber 
AuthorDate: Fri Jul 2 17:46:10 2021 +0200

[FLINK-23338] Add .git-blame-ignore-revs for ignoring refactor commit

This file can be used via:

$ git config blame.ignoreRevsFile .git-blame-ignore-revs

This closes #27.
---
 .git-blame-ignore-revs | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs
new file mode 100644
index 000..b41871e
--- /dev/null
+++ b/.git-blame-ignore-revs
@@ -0,0 +1 @@
+c4baefd1830abe3ba9bcf200be0f5d00095cae12
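Besides the repo-wide `git config` shown in the commit message, the same file can also be passed per invocation with a standard `git blame` option:

```shell
git blame --ignore-revs-file .git-blame-ignore-revs README.md
```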


[flink-training] 02/03: [FLINK-23338] Format code with Spotless/google-java-format

2021-07-13 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit c4baefd1830abe3ba9bcf200be0f5d00095cae12
Author: Rufus Refactor 
AuthorDate: Tue Jul 13 21:45:34 2021 +0200

[FLINK-23338] Format code with Spotless/google-java-format
---
 README.md  |   2 +-
 .../examples/ridecount/RideCountExample.java   |  60 +--
 .../exercises/common/datatypes/TaxiFare.java   | 154 
 .../exercises/common/datatypes/TaxiRide.java   | 297 +++---
 .../common/sources/TaxiFareGenerator.java  |  35 +-
 .../common/sources/TaxiRideGenerator.java  | 109 +++---
 .../exercises/common/utils/DataGenerator.java  | 304 +++
 .../exercises/common/utils/ExerciseBase.java   |  98 ++---
 .../training/exercises/common/utils/GeoUtils.java  | 433 ++---
 .../common/utils/MissingSolutionException.java |  10 +-
 .../exercises/testing/TaxiRideTestBase.java| 291 +++---
 .../training/exercises/testing/TestSource.java |  44 +--
 hourly-tips/DISCUSSION.md  |  34 +-
 .../exercises/hourlytips/HourlyTipsExercise.java   |  38 +-
 .../solutions/hourlytips/HourlyTipsSolution.java   |  92 ++---
 .../exercises/hourlytips/HourlyTipsTest.java   |  97 ++---
 long-ride-alerts/DISCUSSION.md |   2 +-
 long-ride-alerts/README.md |   8 +-
 .../exercises/longrides/LongRidesExercise.java |  61 ++-
 .../solutions/longrides/LongRidesSolution.java | 136 +++
 .../exercises/longrides/LongRidesTest.java | 178 +
 .../ridecleansing/RideCleansingExercise.java   |  57 ++-
 .../ridecleansing/RideCleansingSolution.java   |  58 +--
 .../exercises/ridecleansing/RideCleansingTest.java |  73 ++--
 .../ridesandfares/RidesAndFaresExercise.java   |  69 ++--
 .../ridesandfares/RidesAndFaresSolution.java   | 162 
 .../exercises/ridesandfares/RidesAndFaresTest.java |  93 ++---
 27 files changed, 1489 insertions(+), 1506 deletions(-)
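A reformat like this can be reproduced (or verified) with the Spotless plugin's standard Gradle tasks:

```shell
./gradlew spotlessCheck   # report files that violate the configured format
./gradlew spotlessApply   # rewrite sources in place
```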

diff --git a/README.md b/README.md
index 834032a..37c1d26 100644
--- a/README.md
+++ b/README.md
@@ -111,7 +111,7 @@ Once that’s done you should be able to open 
[`RideCleansingTest`](ride-cleansi
 
 ## Using the Taxi Data Streams
 
-These exercises use data 
[generators](common/src/main/java/org/apache/flink/training/exercises/common/sources)
 that produce simulated event streams 
+These exercises use data 
[generators](common/src/main/java/org/apache/flink/training/exercises/common/sources)
 that produce simulated event streams
 inspired by those shared by the [New York City Taxi & Limousine 
Commission](http://www.nyc.gov/html/tlc/html/home/home.shtml)
 in their public [data set](https://uofi.app.box.com/NYCtaxidata) about taxi 
rides in New York City.
 
diff --git 
a/common/src/main/java/org/apache/flink/training/examples/ridecount/RideCountExample.java
 
b/common/src/main/java/org/apache/flink/training/examples/ridecount/RideCountExample.java
index 74e56fe..36b7b89 100644
--- 
a/common/src/main/java/org/apache/flink/training/examples/ridecount/RideCountExample.java
+++ 
b/common/src/main/java/org/apache/flink/training/examples/ridecount/RideCountExample.java
@@ -29,44 +29,46 @@ import 
org.apache.flink.training.exercises.common.sources.TaxiRideGenerator;
 /**
  * Example that counts the rides for each driver.
  *
- * Note that this is implicitly keeping state for each driver.
- * This sort of simple, non-windowed aggregation on an unbounded set of keys 
will use an unbounded amount of state.
- * When this is an issue, look at the SQL/Table API, or ProcessFunction, or 
state TTL, all of which provide
+ * Note that this is implicitly keeping state for each driver. This sort of 
simple, non-windowed
+ * aggregation on an unbounded set of keys will use an unbounded amount of 
state. When this is an
+ * issue, look at the SQL/Table API, or ProcessFunction, or state TTL, all of 
which provide
  * mechanisms for expiring state for stale keys.
  */
 public class RideCountExample {
 
-   /**
-* Main method.
-*
-* @throws Exception which occurs during job execution.
-*/
-   public static void main(String[] args) throws Exception {
+/**
+ * Main method.
+ *
+ * @throws Exception which occurs during job execution.
+ */
+public static void main(String[] args) throws Exception {
 
-   // set up streaming execution environment
-   StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();
+// set up streaming execution environment
+StreamExecutionEnvironment env = 
StreamExecutionEnvironment.getExecutionEnvironment();
 
-   // start the data generator
-   DataStream rides = env.addSource(new 
TaxiRideGener

[flink-training] 01/03: [FLINK-23338] Add Spotless plugin with Google AOSP style

2021-07-13 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 8399d88b3638d69db01cea341631979a281d1072
Author: Nico Kruber 
AuthorDate: Fri Jul 2 17:36:31 2021 +0200

[FLINK-23338] Add Spotless plugin with Google AOSP style

Use the same format as defined by the main Flink project.
See https://issues.apache.org/jira/browse/FLINK-20651
---
 build.gradle   |  27 ++
 config/checkstyle/checkstyle.xml   | 891 ++---
 config/checkstyle/suppressions.xml |  10 +-
 3 files changed, 460 insertions(+), 468 deletions(-)

diff --git a/build.gradle b/build.gradle
index 3702325..0442b26 100644
--- a/build.gradle
+++ b/build.gradle
@@ -17,6 +17,7 @@
 
 plugins {
 id 'com.github.johnrengelman.shadow' version '7.0.0' apply false
+id "com.diffplug.spotless" version "5.14.0" apply false
 }
 
 description = "Flink Training Exercises"
@@ -24,6 +25,20 @@ description = "Flink Training Exercises"
 allprojects {
 group = 'org.apache.flink'
 version = '1.13-SNAPSHOT'
+
+apply plugin: 'com.diffplug.spotless'
+
+spotless {
+format 'misc', {
+// define the files to apply `misc` to
+target '*.gradle', '*.md', '.gitignore'
+
+// define the steps to apply to those files
+trimTrailingWhitespace()
+indentWithSpaces(4)
+endWithNewline()
+}
+}
 }
 
 subprojects {
@@ -107,6 +122,18 @@ subprojects {
 }
 }
 
+spotless {
+java {
+googleJavaFormat('1.7').aosp()
+
+// \# refers to static imports
+importOrder('org.apache.flink', 'org.apache.flink.shaded', '', 
'javax', 'java', 'scala', '\\#')
+removeUnusedImports()
+
+targetExclude("**/generated*/*.java")
+}
+}
+
 jar {
 manifest {
 attributes 'Built-By': System.getProperty('user.name'),
diff --git a/config/checkstyle/checkstyle.xml b/config/checkstyle/checkstyle.xml
index e656fda..56d219d 100644
--- a/config/checkstyle/checkstyle.xml
+++ b/config/checkstyle/checkstyle.xml
@@ -31,540 +31,505 @@ This file is based on the checkstyle file of Apache Beam.
 
 
 
[checkstyle.xml diff body not recoverable: the XML element content was stripped by the mail archiver, leaving only bare +/- diff markers; per the hunk header above, 540 lines were replaced by 505.]

[flink-training] branch master updated: [FLINK-23335][gradle] add separate run tasks for Java/Scala exercises and solutions

2021-07-13 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 9e90776  [FLINK-23335][gradle] add separate run tasks for Java/Scala 
exercises and solutions
9e90776 is described below

commit 9e907760965a883881d7891baa0d6a82011cb626
Author: Nico Kruber 
AuthorDate: Fri Jul 2 15:43:13 2021 +0200

[FLINK-23335][gradle] add separate run tasks for Java/Scala exercises and 
solutions

This allows you to quickly run the current state of the exercise or the solution
implementation from the command line / IDE.
---
 README.md | 19 ++-
 build.gradle  | 39 +++
 hourly-tips/build.gradle  |  7 ++-
 long-ride-alerts/build.gradle |  7 ++-
 ride-cleansing/build.gradle   |  7 ++-
 rides-and-fares/build.gradle  |  7 ++-
 6 files changed, 81 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 368b904..834032a 100644
--- a/README.md
+++ b/README.md
@@ -176,7 +176,24 @@ Each of these exercises includes an `...Exercise` class 
with most of the necessa
 
 > **:information_source: Note:** As long as your `...Exercise` class is 
 > throwing a `MissingSolutionException`, the provided JUnit test classes will 
 > ignore that failure and verify the correctness of the solution 
 > implementation instead.
 
-There are Java and Scala versions of all the exercise, test, and solution 
classes.
+There are Java and Scala versions of all the exercise, test, and solution 
classes, each of which can be run from IntelliJ as usual.
+
+ Running Exercises, Tests, and Solutions on the Command Line
+
+You can execute exercises, solutions, and tests via `gradlew` from a CLI.
+
+- Tests can be executed as usual:
+
+```bash
+./gradlew test
+./gradlew ::test
+```
+
+- For Java/Scala exercises and solutions, we provide special tasks that are 
listed via
+
+```bash
+./gradlew printRunTasks
+```
 
 -
 
diff --git a/build.gradle b/build.gradle
index b13aaa9..3702325 100644
--- a/build.gradle
+++ b/build.gradle
@@ -94,6 +94,18 @@ subprojects {
 test.runtimeClasspath += configurations.shadow
 }
 
+project.plugins.withId('application') {
+['javaExerciseClassName', 'scalaExerciseClassName',
+ 'javaSolutionClassName', 'scalaSolutionClassName'].each { property ->
+if (project.ext.has(property)) {
+project.tasks.create(name: 
classNamePropertyToTaskName(property), type: JavaExec) {
+classpath = project.sourceSets.main.runtimeClasspath
+mainClass = project.ext.get(property)
+group = 'flink-training'
+}
+}
+}
+}
 
 jar {
 manifest {
@@ -119,3 +131,30 @@ subprojects {
 
 assemble.dependsOn(shadowJar)
 }
+
+tasks.register('printRunTasks') {
+println ''
+println 'Flink Training Tasks runnable from root project \'' + 
project.name + '\''
+println ''
+
+subprojects.findAll { project ->
+boolean first = true;
+project.tasks.withType(JavaExec) { task ->
+if (task.group == 'flink-training') {
+if (first) {
+println ''
+println '> Subproject \'' + project.name + '\''
+first = false;
+}
+println './gradlew :' + project.name + ':' + task.name
+}
+}
+}
+}
+
+static def classNamePropertyToTaskName(String property) {
+return 'run' +
+property.charAt(0).toString().toUpperCase() +
+property.substring(1, property.lastIndexOf('ClassName'))
+
+}
diff --git a/hourly-tips/build.gradle b/hourly-tips/build.gradle
index 75e3e60..20bcd13 100644
--- a/hourly-tips/build.gradle
+++ b/hourly-tips/build.gradle
@@ -1,3 +1,8 @@
+ext.javaExerciseClassName = 
'org.apache.flink.training.exercises.hourlytips.HourlyTipsExercise'
+ext.scalaExerciseClassName = 
'org.apache.flink.training.exercises.hourlytips.scala.HourlyTipsExercise'
+ext.javaSolutionClassName = 
'org.apache.flink.training.solutions.hourlytips.HourlyTipsSolution'
+ext.scalaSolutionClassName = 
'org.apache.flink.training.solutions.hourlytips.scala.HourlyTipsSolution'
+
 apply plugin: 'application'
 
-mainClassName = 
'org.apache.flink.training.exercises.hourlytips.HourlyTipsExercise'
+mainClassName = ext.javaExerciseClassName
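Given the ext properties above, the `printRunTasks` helper introduced in this commit would list entries along these lines (illustrative output, derived from the task-naming scheme in build.gradle):

```
> Subproject 'hourly-tips'
./gradlew :hourly-tips:runJavaExercise
./gradlew :hourly-tips:runScalaExercise
./gradlew :hourly-tips:runJavaSolution
./gradlew :hourly-tips:runScalaSolution
```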
diff --git a/long-ride-alerts/build.gradle b/long-ride-alerts/build.gradle
index 59b9448..8036a4f 100644
--- a/long-ride-alerts/build.gradle
+++ b/long-ride-alerts/build.gradle
@@ -1,3 +1,8 @@
+e

[flink-training] branch master updated: [FLINK-23337][gradle] Properly use the 'shadow' plugin

2021-07-13 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 453d65a  [FLINK-23337][gradle] Properly use the 'shadow' plugin
453d65a is described below

commit 453d65a3e0bfe244427a6e025a1798c677073289
Author: Nico Kruber 
AuthorDate: Fri Jul 2 17:10:03 2021 +0200

[FLINK-23337][gradle] Properly use the 'shadow' plugin

This removes the need for the custom `flinkShadowJar` configuration and 
instead
defines dependencies with the default ways that the 'shadow' plugin offers.
---
 build.gradle| 69 ++---
 common/build.gradle |  8 ++-
 2 files changed, 30 insertions(+), 47 deletions(-)

diff --git a/build.gradle b/build.gradle
index 6077960..b13aaa9 100644
--- a/build.gradle
+++ b/build.gradle
@@ -60,62 +60,38 @@ subprojects {
 }
 }
 
-// NOTE: We cannot use "compileOnly" or "shadow" configurations since then 
we could not run code
-// in the IDE or with "gradle run". We also cannot exclude transitive 
dependencies from the
-// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
-// -> Explicitly define the // libraries we want to be included in the 
"flinkShadowJar" configuration!
-configurations {
-flinkShadowJar // dependencies which go into the shadowJar
-
-// provided by Flink
-flinkShadowJar.exclude group: 'org.apache.flink', module: 
'force-shading'
-flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 
'jsr305'
-flinkShadowJar.exclude group: 'org.slf4j'
-flinkShadowJar.exclude group: 'log4j'
-flinkShadowJar.exclude group: 'org.apache.logging.log4j', module: 
'log4j-to-slf4j'
-
-// already provided dependencies from serializer frameworks
-flinkShadowJar.exclude group: 'com.esotericsoftware.kryo', module: 
'kryo'
-flinkShadowJar.exclude group: 'javax.servlet', module: 'servlet-api'
-flinkShadowJar.exclude group: 'org.apache.httpcomponents', module: 
'httpclient'
-}
-
 // common set of dependencies
 dependencies {
-implementation 
"org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
-implementation "org.apache.logging.log4j:log4j-api:${log4jVersion}"
-implementation "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+shadow "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+shadow "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+shadow "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+
+shadow 
"org.apache.flink:flink-clients_${scalaBinaryVersion}:${flinkVersion}"
+shadow "org.apache.flink:flink-java:${flinkVersion}"
+shadow 
"org.apache.flink:flink-streaming-java_${scalaBinaryVersion}:${flinkVersion}"
+shadow 
"org.apache.flink:flink-streaming-scala_${scalaBinaryVersion}:${flinkVersion}"
 
 if (project != project(":common")) {
 implementation project(path: ':common')
-// transitive dependencies for flinkShadowJar need to be defined 
above
-// (the alternative of using configuration: 'shadow' does not work 
there because that adds a dependency on
-// the jar file, not the sources)
-flinkShadowJar project(path: ':common', transitive: false)
-
 testImplementation(project(":common")) {
 capabilities { requireCapability("$group:common-test") }
 }
 }
 }
 
-// make flinkShadowJar dependencies available:
+// add solution source dirs:
 sourceSets {
 main.java.srcDirs += 'src/solution/java'
 main.scala.srcDirs += 'src/solution/scala'
-main.compileClasspath += configurations.flinkShadowJar
-main.runtimeClasspath += configurations.flinkShadowJar
 
-test.compileClasspath += configurations.flinkShadowJar
-test.runtimeClasspath += configurations.flinkShadowJar
+// Add shadow configuration to runtime class path so that the
+// dynamically-generated tasks by IntelliJ are able to run and have
+// all dependencies they need. (Luckily, this does not influence what
+// ends up in the final shadowJar.)
+main.runtimeClasspath += configurations.shadow
 
-javadoc.classpath += configurations.flinkShadowJar
-}
-
-eclipse {
-classpath {
-plusConfigurations += [configurations.flinkShadowJar]
-}
+test.compileClasspath += configurations.shadow
+test.runtimeClasspath += configurations.shadow
 }
 
 
@@ -127,7 +103,18 @@ subprojects {
   

[flink-training] branch master updated: [FLINK-23336][log4j] update Log4J to 2.12.1 as used by Flink 1.13

2021-07-13 Thread nkruber

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 7c323f4  [FLINK-23336][log4j] update Log4J to 2.12.1 as used by Flink 
1.13
7c323f4 is described below

commit 7c323f4bb0a4659a57498c099dbe900ba023d43d
Author: Nico Kruber 
AuthorDate: Fri Jul 2 11:42:41 2021 +0200

[FLINK-23336][log4j] update Log4J to 2.12.1 as used by Flink 1.13
---
 build.gradle|  9 +
 .../src/main/resources/log4j2.properties| 13 +++--
 .../src/main/resources/log4j2.properties| 13 +++--
 .../src/main/resources/log4j2.properties| 13 +++--
 .../src/main/resources/log4j2.properties| 13 +++--
 .../checkpointing/src/main/resources/log4j2.properties  | 13 +++--
 .../src/main/resources/log4j2.properties| 13 +++--
 .../introduction/src/main/resources/log4j2.properties   | 13 +++--
 .../object-reuse/src/main/resources/log4j2.properties   | 13 +++--
 .../exercise/src/main/resources/log4j2.properties   | 13 +++--
 .../src/main/resources/log4j2.properties| 13 +++--
 .../solution/src/main/resources/log4j2.properties   | 13 +++--
 .../throughput/src/main/resources/log4j2.properties | 13 +++--
 13 files changed, 89 insertions(+), 76 deletions(-)

diff --git a/build.gradle b/build.gradle
index 5d0c8e9..6077960 100644
--- a/build.gradle
+++ b/build.gradle
@@ -37,8 +37,7 @@ subprojects {
 javaVersion = '1.8'
 flinkVersion = '1.13.1'
 scalaBinaryVersion = '2.12'
-slf4jVersion = '1.7.15'
-log4jVersion = '1.2.17'
+log4jVersion = '2.12.1'
 junitVersion = '4.12'
 }
 
@@ -73,6 +72,7 @@ subprojects {
 flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 
'jsr305'
 flinkShadowJar.exclude group: 'org.slf4j'
 flinkShadowJar.exclude group: 'log4j'
+flinkShadowJar.exclude group: 'org.apache.logging.log4j', module: 
'log4j-to-slf4j'
 
 // already provided dependencies from serializer frameworks
 flinkShadowJar.exclude group: 'com.esotericsoftware.kryo', module: 
'kryo'
@@ -82,8 +82,9 @@ subprojects {
 
 // common set of dependencies
 dependencies {
-implementation "log4j:log4j:${log4jVersion}"
-implementation "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+implementation 
"org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+implementation "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+implementation "org.apache.logging.log4j:log4j-core:${log4jVersion}"
 
 if (project != project(":common")) {
 implementation project(path: ':common')
diff --git a/rides-and-fares/src/main/resources/log4j.properties 
b/hourly-tips/src/main/resources/log4j2.properties
similarity index 78%
copy from rides-and-fares/src/main/resources/log4j.properties
copy to hourly-tips/src/main/resources/log4j2.properties
index da32ea0..8319d24 100644
--- a/rides-and-fares/src/main/resources/log4j.properties
+++ b/hourly-tips/src/main/resources/log4j2.properties
@@ -15,9 +15,10 @@
 #  See the License for the specific language governing permissions and
 # limitations under the License.
 

-
-log4j.rootLogger=INFO, console
-
-log4j.appender.console=org.apache.log4j.ConsoleAppender
-log4j.appender.console.layout=org.apache.log4j.PatternLayout
-log4j.appender.console.layout.ConversionPattern=%d{HH:mm:ss,SSS} %-5p %-60c %x 
- %m%n
+loggers=rootLogger
+appender.console.type=Console
+appender.console.name=STDOUT
+appender.console.layout.type=PatternLayout
+appender.console.layout.pattern=%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - 
%msg%n
+rootLogger.level=INFO
+rootLogger.appenderRef.console.ref=STDOUT
\ No newline at end of file
diff --git a/hourly-tips/src/main/resources/log4j.properties 
b/long-ride-alerts/src/main/resources/log4j2.properties
similarity index 78%
rename from hourly-tips/src/main/resources/log4j.properties
rename to long-ride-alerts/src/main/resources/log4j2.properties
index da32ea0..8319d24 100644
--- a/hourly-tips/src/main/resources/log4j.properties
+++ b/long-ride-alerts/src/main/resources/log4j2.properties
@@ -15,9 +15,10 @@
 #  See the License for the specific language governing permissions and
 # limitations under the License.
 

-
-log4j.rootLogger=INFO, console
-
-log4j.appender.console=org.apache.log4j.ConsoleAppender
-
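For reference, a complete Log4j 2 console configuration in the properties format introduced by this commit looks roughly like the following. This is an illustrative reconstruction of the format, not a verbatim copy of the committed files:

```properties
# Illustrative Log4j 2 properties configuration (reconstruction, not the
# committed file): a single console appender at INFO level.
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n

rootLogger.level = INFO
rootLogger.appenderRef.console.ref = STDOUT
```

Unlike Log4j 1, appenders and the root logger are configured through dotted property prefixes (`appender.<id>.*`, `rootLogger.*`) rather than `log4j.appender.*` keys.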

[flink-training] branch master updated: [FLINK-23334][gradle] let the subprojects decide whether they implement an application

2021-07-13 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 63d360e  [FLINK-23334][gradle] let the subprojects decide whether they 
implement an application
63d360e is described below

commit 63d360e49368041a4bbdd320a5c4b3600961dab2
Author: Nico Kruber 
AuthorDate: Fri Jul 2 11:37:19 2021 +0200

[FLINK-23334][gradle] let the subprojects decide whether they implement an 
application

This gives a cleaner gradle project file layout.
---
 build.gradle  | 7 ---
 hourly-tips/build.gradle  | 2 ++
 long-ride-alerts/build.gradle | 2 ++
 ride-cleansing/build.gradle   | 2 ++
 rides-and-fares/build.gradle  | 2 ++
 5 files changed, 8 insertions(+), 7 deletions(-)

diff --git a/build.gradle b/build.gradle
index 7697737..5d0c8e9 100644
--- a/build.gradle
+++ b/build.gradle
@@ -29,9 +29,6 @@ allprojects {
 subprojects {
 apply plugin: 'java'
 apply plugin: 'scala' // optional; uncomment if needed
-if (project != project(":common")) {
-apply plugin: 'application'
-}
 apply plugin: 'com.github.johnrengelman.shadow'
 apply plugin: 'checkstyle'
 apply plugin: 'eclipse'
@@ -120,10 +117,6 @@ subprojects {
 }
 }
 
-if (plugins.findPlugin('application')) {
-applicationDefaultJvmArgs = ["-Dlog4j.configuration=log4j.properties"]
-run.classpath = sourceSets.main.runtimeClasspath
-}
 
 jar {
 manifest {
diff --git a/hourly-tips/build.gradle b/hourly-tips/build.gradle
index 11df511..75e3e60 100644
--- a/hourly-tips/build.gradle
+++ b/hourly-tips/build.gradle
@@ -1 +1,3 @@
+apply plugin: 'application'
+
 mainClassName = 
'org.apache.flink.training.exercises.hourlytips.HourlyTipsExercise'
diff --git a/long-ride-alerts/build.gradle b/long-ride-alerts/build.gradle
index 62e4ffd..59b9448 100644
--- a/long-ride-alerts/build.gradle
+++ b/long-ride-alerts/build.gradle
@@ -1 +1,3 @@
+apply plugin: 'application'
+
 mainClassName = 
'org.apache.flink.training.exercises.longrides.LongRidesExercise'
diff --git a/ride-cleansing/build.gradle b/ride-cleansing/build.gradle
index 106e3b9..1f03917 100644
--- a/ride-cleansing/build.gradle
+++ b/ride-cleansing/build.gradle
@@ -1 +1,3 @@
+apply plugin: 'application'
+
 mainClassName = 
'org.apache.flink.training.exercises.ridecleansing.RideCleansingExercise'
diff --git a/rides-and-fares/build.gradle b/rides-and-fares/build.gradle
index 18366af..565aec6 100644
--- a/rides-and-fares/build.gradle
+++ b/rides-and-fares/build.gradle
@@ -1 +1,3 @@
+apply plugin: 'application'
+
 mainClassName = 
'org.apache.flink.training.exercises.ridesandfares.RidesAndFaresExercise'


[flink-training] 02/03: [hotfix][gradle] update build-scan plugin version

2021-07-13 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 1504a6477abc465775a39d9073b184e36659225d
Author: Nico Kruber 
AuthorDate: Fri Jul 2 11:17:24 2021 +0200

[hotfix][gradle] update build-scan plugin version
---
 settings.gradle | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/settings.gradle b/settings.gradle
index 24ea327..5e98871 100644
--- a/settings.gradle
+++ b/settings.gradle
@@ -1,5 +1,5 @@
 plugins {
-id "com.gradle.enterprise" version "3.2.1"
+id "com.gradle.enterprise" version "3.6.3"
 }
 
 rootProject.name = 'flink-training'


[flink-training] 01/03: [FLINK-23332][gradle] update Gradle version

2021-07-13 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 82620e098fa39c71538144b7f6a115163b071ccf
Author: Nico Kruber 
AuthorDate: Fri Jul 2 11:09:02 2021 +0200

[FLINK-23332][gradle] update Gradle version

This also requires a newer 'shadow' version.
---
 build.gradle | 2 +-
 gradle/wrapper/gradle-wrapper.properties | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/build.gradle b/build.gradle
index b516b5d..eb42442 100644
--- a/build.gradle
+++ b/build.gradle
@@ -16,7 +16,7 @@
  */
 
 plugins {
-id 'com.github.johnrengelman.shadow' version '5.2.0' apply false
+id 'com.github.johnrengelman.shadow' version '7.0.0' apply false
 }
 
 subprojects {
diff --git a/gradle/wrapper/gradle-wrapper.properties 
b/gradle/wrapper/gradle-wrapper.properties
index 6254d2d..69a9715 100644
--- a/gradle/wrapper/gradle-wrapper.properties
+++ b/gradle/wrapper/gradle-wrapper.properties
@@ -1,5 +1,5 @@
 distributionBase=GRADLE_USER_HOME
 distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-6.2.1-all.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-7.1-bin.zip
 zipStoreBase=GRADLE_USER_HOME
 zipStorePath=wrapper/dists


[flink-training] branch master updated (dd7be31 -> 5c8386d)

2021-07-13 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from dd7be31  [FLINK-22868] Update to use Flink 1.13.1
 new 82620e0  [FLINK-23332][gradle] update Gradle version
 new 1504a64  [hotfix][gradle] update build-scan plugin version
 new 5c8386d  [hotfix][gradle] set project properties in a central place

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 build.gradle | 14 --
 gradle/wrapper/gradle-wrapper.properties |  2 +-
 settings.gradle  |  2 +-
 3 files changed, 10 insertions(+), 8 deletions(-)


[flink-training] 03/03: [hotfix][gradle] set project properties in a central place

2021-07-13 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 5c8386d022719f72fbb161c6727c0c2e2a3c12f1
Author: Nico Kruber 
AuthorDate: Fri Jul 2 11:36:27 2021 +0200

[hotfix][gradle] set project properties in a central place

Also, we actually only have the root project's description, so let's not set
it for all subprojects!
---
 build.gradle | 12 +++-
 1 file changed, 7 insertions(+), 5 deletions(-)

diff --git a/build.gradle b/build.gradle
index eb42442..7697737 100644
--- a/build.gradle
+++ b/build.gradle
@@ -19,6 +19,13 @@ plugins {
 id 'com.github.johnrengelman.shadow' version '7.0.0' apply false
 }
 
+description = "Flink Training Exercises"
+
+allprojects {
+group = 'org.apache.flink'
+version = '1.13-SNAPSHOT'
+}
+
 subprojects {
 apply plugin: 'java'
 apply plugin: 'scala' // optional; uncomment if needed
@@ -29,11 +36,6 @@ subprojects {
 apply plugin: 'checkstyle'
 apply plugin: 'eclipse'
 
-// artifact properties
-group = 'org.apache.flink'
-version = '1.13-SNAPSHOT'
-description = """Flink Training Exercises"""
-
 ext {
 javaVersion = '1.8'
 flinkVersion = '1.13.1'


[flink-jira-bot] 01/01: [hotfix] add missing "days" after number of days

2021-05-31 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch NicoK-patch-1
in repository https://gitbox.apache.org/repos/asf/flink-jira-bot.git

commit 1f593d9718b4d667de356ab3414767e05dd1ec3b
Author: Nico Kruber 
AuthorDate: Mon May 31 13:52:51 2021 +0200

[hotfix] add missing "days" after number of days
---
 config.yaml | 12 ++--
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/config.yaml b/config.yaml
index f8aa8cf..5c949a7 100644
--- a/config.yaml
+++ b/config.yaml
@@ -23,12 +23,12 @@ stale_assigned:
 warning_label: "stale-assigned"
 done_label: "auto-unassigned"
 warning_comment: |
-I am the [Flink Jira Bot|https://github.com/apache/flink-jira-bot/] 
and I help the community manage its development. I see this issue is assigned 
but has not received an update in {stale_days}, so it has been labeled 
"{warning_label}".
+I am the [Flink Jira Bot|https://github.com/apache/flink-jira-bot/] 
and I help the community manage its development. I see this issue is assigned 
but has not received an update in {stale_days} days, so it has been labeled 
"{warning_label}".
 If you are still working on the issue, please remove the label and add 
a comment updating the community on your progress.  If this issue is waiting on 
feedback, please consider this a reminder to the committer/reviewer. Flink is a 
very active project, and so we appreciate your patience.
 If you are no longer working on the issue, please unassign yourself so 
someone else may work on it. If the "warning_label" label is not removed in 
{warning_days} days, the issue will be automatically unassigned.
 
 done_comment: |
-This issue was marked "{warning_label}" {warning_days} ago and has not 
received an update. I have automatically removed the current assignee from the 
issue so others in the community may pick it up. If you are still working on 
this ticket, please ask a committer to reassign you and provide an update about 
your current status.
+This issue was marked "{warning_label}" {warning_days} days ago and 
has not received an update. I have automatically removed the current assignee 
from the issue so others in the community may pick it up. If you are still 
working on this ticket, please ask a committer to reassign you and provide an 
update about your current status.
 
 stale_minor:
 ticket_limit: 10
@@ -40,7 +40,7 @@ stale_minor:
 I am the [Flink Jira Bot|https://github.com/apache/flink-jira-bot/] 
and I help the community manage its development. I noticed that neither this 
issue nor its subtasks had updates for {stale_days} days, so I labeled it 
"{warning_label}".  If you are still affected by this bug or are still 
interested in this issue, please update and remove the label.
 
 done_comment: |
-This issue was labeled "{warning_label}" {warning_days} ago and has 
not received any updates so I have gone ahead and closed it.  If you are still 
affected by this or would like to raise the priority of this ticket please 
re-open, removing the label "{done_label}" and raise the ticket priority 
accordingly.
+This issue was labeled "{warning_label}" {warning_days} days ago and 
has not received any updates so I have gone ahead and closed it.  If you are 
still affected by this or would like to raise the priority of this ticket 
please re-open, removing the label "{done_label}" and raise the ticket priority 
accordingly.
 
 stale_blocker:
 ticket_limit: 5
@@ -52,7 +52,7 @@ stale_blocker:
 I am the [Flink Jira Bot|https://github.com/apache/flink-jira-bot/] 
and I help the community manage its development. I see this issue has been 
marked as a Blocker but is unassigned and neither itself nor its Sub-Tasks have 
been updated for {stale_days} days. I have gone ahead and marked it 
"{warning_label}". If this ticket is a Blocker, please either assign yourself 
or give an update. Afterwards, please remove the label or in {warning_days} 
days the issue will be deprioritized.
 
 done_comment: |
-This issue was labeled "{warning_label}" {warning_days} ago and has 
not received any updates so it is being deprioritized. If this ticket is 
actually a Blocker, please raise the priority and ask a committer to assign you 
the issue or revive the public discussion.
+This issue was labeled "{warning_label}" {warning_days} days ago and 
has not received any updates so it is being deprioritized. If this ticket is 
actually a Blocker, please raise the priority and ask a committer to assign you 
the issue or revive the public discussion.
 
 stale_critical:
 ticket_limit: 10
@@ -64,7 +64,7 @@ stale_critical:
 I am the [Flink Jira Bot|https://github.com/apache/flink-jira-bot/] 
and I hel
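The `{stale_days}`/`{warning_label}` fields in the templates above are Python str.format-style placeholders (flink-jira-bot is a Python project). A minimal sketch of how such a template could be rendered — illustrative only, not the bot's actual code:

```python
# Illustrative sketch (not flink-jira-bot's actual code): render a comment
# template whose placeholders use Python str.format() field syntax.
warning_comment = (
    'I see this issue is assigned but has not received an update in '
    '{stale_days} days, so it has been labeled "{warning_label}".'
)

def render(template: str, **fields) -> str:
    """Substitute the {name} fields of a bot comment template."""
    return template.format(**fields)

message = render(warning_comment, stale_days=30, warning_label="stale-assigned")
print(message)
```

This is why the hotfix only touches the template text: the substituted values (`30`, `stale-assigned`, …) come from the bot's configuration, so the missing word "days" had to be added in the template itself.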

[flink-jira-bot] branch NicoK-patch-1 created (now 1f593d9)

2021-05-31 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch NicoK-patch-1
in repository https://gitbox.apache.org/repos/asf/flink-jira-bot.git.


  at 1f593d9  [hotfix] add missing "days" after number of days

This branch includes the following new commits:

 new 1f593d9  [hotfix] add missing "days" after number of days

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[flink] branch master updated (1e71b1c -> 0009737)

2021-03-04 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 1e71b1c  [FLINK-21552][runtime] Unreserve managed memory if 
OpaqueMemoryResource cannot be initialized.
 add 0009737  [hotfix][docs] wrong brackets in CREATE VIEW statement

No new revisions were added by this update.

Summary of changes:
 docs/content/docs/dev/table/sql/create.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink] 01/01: [hotfix][docs] wrong brackets in CREATE VIEW statement

2021-03-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch NicoK-hotfix-docs-create-view
in repository https://gitbox.apache.org/repos/asf/flink.git

commit b8eab5f240b1dabeab626fbebde24b6eaddc6e0c
Author: Nico Kruber 
AuthorDate: Wed Mar 3 13:32:06 2021 +0100

[hotfix][docs] wrong brackets in CREATE VIEW statement
---
 docs/content/docs/dev/table/sql/create.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/content/docs/dev/table/sql/create.md 
b/docs/content/docs/dev/table/sql/create.md
index 0e7e2e9..9113ba7 100644
--- a/docs/content/docs/dev/table/sql/create.md
+++ b/docs/content/docs/dev/table/sql/create.md
@@ -559,7 +559,7 @@ The key and value of expression `key1=val1` should both be 
string literal.
 ## CREATE VIEW
 ```sql
 CREATE [TEMPORARY] VIEW [IF NOT EXISTS] [catalog_name.][db_name.]view_name
-  [{columnName [, columnName ]* }] [COMMENT view_comment]
+  [( columnName [, columnName ]* )] [COMMENT view_comment]
   AS query_expression
 ```
 



[flink] branch NicoK-hotfix-docs-create-view created (now b8eab5f)

2021-03-03 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch NicoK-hotfix-docs-create-view
in repository https://gitbox.apache.org/repos/asf/flink.git.


  at b8eab5f  [hotfix][docs] wrong brackets in CREATE VIEW statement

This branch includes the following new commits:

 new b8eab5f  [hotfix][docs] wrong brackets in CREATE VIEW statement

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[flink] branch release-1.10 updated (10f155e -> a14665d)

2020-11-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 10f155e  [BP-1.10][FLINK-20013][network] BoundedBlockingSubpartition 
may leak network buffer if task is failed or canceled
 add a14665d  [FLINK-16753] Use CheckpointException to wrap exceptions 
thrown from StreamTask (#14071)

No new revisions were added by this update.

Summary of changes:
 .../checkpoint/CheckpointFailureManager.java   |  1 +
 .../checkpoint/CheckpointFailureReason.java|  2 +
 .../flink/streaming/runtime/tasks/StreamTask.java  |  4 +-
 .../streaming/runtime/tasks/StreamTaskTest.java| 66 ++
 4 files changed, 72 insertions(+), 1 deletion(-)



[flink] branch release-1.11 updated (c24185d -> bc9f0fc)

2020-11-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git.


from c24185d  [hotfix] Adjust japicmp exclusion to permit adding methods to 
SourceReaderContext and SplitEnumeratorContext interfaces
 add bc9f0fc  [FLINK-16753][checkpointing] Use CheckpointException to wrap 
exceptions thrown from AsyncCheckpointRunnable (#14072)

No new revisions were added by this update.

Summary of changes:
 .../checkpoint/CheckpointFailureManager.java   |   1 +
 .../checkpoint/CheckpointFailureReason.java|   2 +
 .../runtime/tasks/AsyncCheckpointRunnable.java |   6 +-
 .../runtime/tasks/AsyncCheckpointRunnableTest.java | 112 +
 4 files changed, 120 insertions(+), 1 deletion(-)
 create mode 100644 
flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnableTest.java



[flink] branch release-1.11 updated: [FLINK-16753][checkpointing] Use CheckpointException to wrap exceptions thrown from AsyncCheckpointRunnable (#14072)

2020-11-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new bc9f0fc  [FLINK-16753][checkpointing] Use CheckpointException to wrap 
exceptions thrown from AsyncCheckpointRunnable (#14072)
bc9f0fc is described below

commit bc9f0fc7d6a02211e1e30ecc425d2dbfe4a0bcb5
Author: Jiayi Liao 
AuthorDate: Mon Nov 16 17:20:56 2020 +0800

[FLINK-16753][checkpointing] Use CheckpointException to wrap exceptions 
thrown from AsyncCheckpointRunnable (#14072)
---
 .../checkpoint/CheckpointFailureManager.java   |   1 +
 .../checkpoint/CheckpointFailureReason.java|   2 +
 .../runtime/tasks/AsyncCheckpointRunnable.java |   6 +-
 .../runtime/tasks/AsyncCheckpointRunnableTest.java | 112 +
 4 files changed, 120 insertions(+), 1 deletion(-)

diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
index fbf5d98..ca8f42d 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
@@ -117,6 +117,7 @@ public class CheckpointFailureManager {
case CHECKPOINT_DECLINED_INPUT_END_OF_STREAM:
 
case EXCEPTION:
+   case CHECKPOINT_ASYNC_EXCEPTION:
case TASK_FAILURE:
case TASK_CHECKPOINT_FAILURE:
case UNKNOWN_TASK_CHECKPOINT_NOTIFICATION_FAILURE:
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
index cd787d0..e20e57f 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
@@ -36,6 +36,8 @@ public enum CheckpointFailureReason {
 
EXCEPTION(true, "An Exception occurred while triggering the 
checkpoint."),
 
+   CHECKPOINT_ASYNC_EXCEPTION(false, "Asynchronous task checkpoint 
failed."),
+
CHECKPOINT_EXPIRED(false, "Checkpoint expired before completing."),
 
CHECKPOINT_SUBSUMED(false, "Checkpoint has been subsumed."),
diff --git 
a/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
 
b/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
index af10411..e5aeb68 100644
--- 
a/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
+++ 
b/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
@@ -18,6 +18,8 @@
 package org.apache.flink.streaming.runtime.tasks;
 
 import org.apache.flink.core.fs.FileSystemSafetyNet;
+import org.apache.flink.runtime.checkpoint.CheckpointException;
+import org.apache.flink.runtime.checkpoint.CheckpointFailureReason;
 import org.apache.flink.runtime.checkpoint.CheckpointMetaData;
 import org.apache.flink.runtime.checkpoint.CheckpointMetrics;
 import org.apache.flink.runtime.checkpoint.TaskStateSnapshot;
@@ -195,7 +197,9 @@ final class AsyncCheckpointRunnable implements Runnable, 
Closeable {
// We only report the exception for the 
original cause of fail and cleanup.
// Otherwise this followup exception could race 
the original exception in failing the task.
try {
-   
taskEnvironment.declineCheckpoint(checkpointMetaData.getCheckpointId(), 
checkpointException);
+   taskEnvironment.declineCheckpoint(
+   
checkpointMetaData.getCheckpointId(),
+   new 
CheckpointException(CheckpointFailureReason.CHECKPOINT_ASYNC_EXCEPTION, 
checkpointException));
} catch (Exception unhandled) {
AsynchronousException asyncException = 
new AsynchronousException(unhandled);

asyncExceptionHandler.handleAsyncException("Failure in asynchronous checkpoint 
materialization", asyncException);
diff --git 
a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnableTest.java
 
b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnableTes

[flink] branch release-1.11 updated: [FLINK-16753][checkpointing] Use CheckpointException to wrap exceptions thrown from AsyncCheckpointRunnable (#14072)

2020-11-16 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new bc9f0fc  [FLINK-16753][checkpointing] Use CheckpointException to wrap 
exceptions thrown from AsyncCheckpointRunnable (#14072)
bc9f0fc is described below

commit bc9f0fc7d6a02211e1e30ecc425d2dbfe4a0bcb5
Author: Jiayi Liao 
AuthorDate: Mon Nov 16 17:20:56 2020 +0800

[FLINK-16753][checkpointing] Use CheckpointException to wrap exceptions 
thrown from AsyncCheckpointRunnable (#14072)
---
 .../checkpoint/CheckpointFailureManager.java   |   1 +
 .../checkpoint/CheckpointFailureReason.java|   2 +
 .../runtime/tasks/AsyncCheckpointRunnable.java |   6 +-
 .../runtime/tasks/AsyncCheckpointRunnableTest.java | 112 +
 4 files changed, 120 insertions(+), 1 deletion(-)

diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
index fbf5d98..ca8f42d 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureManager.java
@@ -117,6 +117,7 @@ public class CheckpointFailureManager {
case CHECKPOINT_DECLINED_INPUT_END_OF_STREAM:
 
case EXCEPTION:
+   case CHECKPOINT_ASYNC_EXCEPTION:
case TASK_FAILURE:
case TASK_CHECKPOINT_FAILURE:
case UNKNOWN_TASK_CHECKPOINT_NOTIFICATION_FAILURE:
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
index cd787d0..e20e57f 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/checkpoint/CheckpointFailureReason.java
@@ -36,6 +36,8 @@ public enum CheckpointFailureReason {
 
EXCEPTION(true, "An Exception occurred while triggering the 
checkpoint."),
 
+   CHECKPOINT_ASYNC_EXCEPTION(false, "Asynchronous task checkpoint 
failed."),
+
CHECKPOINT_EXPIRED(false, "Checkpoint expired before completing."),
 
CHECKPOINT_SUBSUMED(false, "Checkpoint has been subsumed."),
diff --git 
a/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
 
b/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
index af10411..e5aeb68 100644
--- 
a/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
+++ 
b/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnable.java
@@ -18,6 +18,8 @@
 package org.apache.flink.streaming.runtime.tasks;
 
 import org.apache.flink.core.fs.FileSystemSafetyNet;
+import org.apache.flink.runtime.checkpoint.CheckpointException;
+import org.apache.flink.runtime.checkpoint.CheckpointFailureReason;
 import org.apache.flink.runtime.checkpoint.CheckpointMetaData;
 import org.apache.flink.runtime.checkpoint.CheckpointMetrics;
 import org.apache.flink.runtime.checkpoint.TaskStateSnapshot;
@@ -195,7 +197,9 @@ final class AsyncCheckpointRunnable implements Runnable, 
Closeable {
// We only report the exception for the 
original cause of fail and cleanup.
// Otherwise this followup exception could race 
the original exception in failing the task.
try {
-   
taskEnvironment.declineCheckpoint(checkpointMetaData.getCheckpointId(), 
checkpointException);
+   taskEnvironment.declineCheckpoint(
+   
checkpointMetaData.getCheckpointId(),
+   new 
CheckpointException(CheckpointFailureReason.CHECKPOINT_ASYNC_EXCEPTION, 
checkpointException));
} catch (Exception unhandled) {
AsynchronousException asyncException = 
new AsynchronousException(unhandled);

asyncExceptionHandler.handleAsyncException("Failure in asynchronous checkpoint 
materialization", asyncException);
diff --git 
a/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnableTest.java
 
b/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/tasks/AsyncCheckpointRunnableTes

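The hunks above route asynchronous checkpoint failures through a new, structured failure reason (CHECKPOINT_ASYNC_EXCEPTION) instead of a bare exception before declining the checkpoint. Below is a minimal, self-contained sketch of that pattern; the enum and exception here are simplified stand-ins modeled on the diff, not Flink's actual CheckpointFailureReason/CheckpointException classes:

```java
// Simplified stand-ins illustrating the pattern from the diff above:
// a structured failure reason plus an exception wrapper that preserves
// the original cause. Not the real Flink types.
public class CheckpointDeclineSketch {

    // Each reason records whether the failure happened before any state
    // was written (pre-flight) and carries a human-readable message.
    enum CheckpointFailureReason {
        EXCEPTION(true, "An Exception occurred while triggering the checkpoint."),
        CHECKPOINT_ASYNC_EXCEPTION(false, "Asynchronous task checkpoint failed."),
        CHECKPOINT_EXPIRED(false, "Checkpoint expired before completing.");

        private final boolean preFlight;
        private final String message;

        CheckpointFailureReason(boolean preFlight, String message) {
            this.preFlight = preFlight;
            this.message = message;
        }

        boolean isPreFlight() { return preFlight; }
        String message() { return message; }
    }

    // Wraps the original cause together with a structured reason, so a
    // failure manager can classify the decline instead of seeing a bare
    // Throwable.
    static class CheckpointException extends Exception {
        private final CheckpointFailureReason reason;

        CheckpointException(CheckpointFailureReason reason, Throwable cause) {
            super(reason.message(), cause);
            this.reason = reason;
        }

        CheckpointFailureReason getReason() { return reason; }
    }

    public static void main(String[] args) {
        Exception original = new IllegalStateException("async snapshot I/O failed");
        CheckpointException declined =
                new CheckpointException(
                        CheckpointFailureReason.CHECKPOINT_ASYNC_EXCEPTION, original);
        System.out.println(declined.getReason() + ": " + declined.getMessage());
        // prints: CHECKPOINT_ASYNC_EXCEPTION: Asynchronous task checkpoint failed.
    }
}
```

Keeping the original cause inside the wrapper preserves it for logging while giving the failure manager a stable reason to count, which is why CHECKPOINT_ASYNC_EXCEPTION is added alongside EXCEPTION and TASK_FAILURE in the CheckpointFailureManager switch shown above.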
[flink] branch master updated (8a63e64 -> 700a92d)

2020-11-10 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 8a63e64  [FLINK-20045][tests] Let TestingLeaderEelctionEventHandler 
wait until being initialized
 add 700a92d  [FLINK-19972][serialization] add more hints in case of 
incompatbilities

No new revisions were added by this update.

Summary of changes:
 .../flink/runtime/state/heap/HeapKeyedStateBackend.java| 14 --
 .../flink/runtime/state/heap/HeapRestoreOperation.java |  7 ++-
 .../contrib/streaming/state/RocksDBKeyedStateBackend.java  | 13 +++--
 .../state/restore/AbstractRocksDBRestoreOperation.java |  7 ++-
 4 files changed, 35 insertions(+), 6 deletions(-)



[flink-web] branch asf-site updated (734f4a1 -> 008e907)

2020-07-30 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from 734f4a1  Rebuild website
 new 707dd32  Add Blogbost: Advanced Flink Application Patterns: Custom 
Window Processing
 new 008e907  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 _posts/2020-07-30-demo-fraud-detection-3.md|  660 
 content/blog/feed.xml  | 1115 
 content/blog/index.html|   38 +-
 content/blog/page10/index.html |   38 +-
 content/blog/page11/index.html |   44 +-
 content/blog/page12/index.html |   45 +-
 content/blog/page13/index.html |   25 +
 content/blog/page2/index.html  |   40 +-
 content/blog/page3/index.html  |   38 +-
 content/blog/page4/index.html  |   36 +-
 content/blog/page5/index.html  |   38 +-
 content/blog/page6/index.html  |   40 +-
 content/blog/page7/index.html  |   38 +-
 content/blog/page8/index.html  |   36 +-
 content/blog/page9/index.html  |   37 +-
 .../img/blog/patterns-blog-3/evaluation-delays.png |  Bin 0 -> 29120 bytes
 .../blog/patterns-blog-3/keyed-state-scoping.png   |  Bin 0 -> 199113 bytes
 content/img/blog/patterns-blog-3/late-events.png   |  Bin 0 -> 20483 bytes
 .../img/blog/patterns-blog-3/pre-aggregation.png   |  Bin 0 -> 33817 bytes
 .../patterns-blog-3/sample-rule-definition.png |  Bin 0 -> 98413 bytes
 content/img/blog/patterns-blog-3/time-windows.png  |  Bin 0 -> 37632 bytes
 content/img/blog/patterns-blog-3/type-kryo.png |  Bin 0 -> 28294 bytes
 content/img/blog/patterns-blog-3/type-pojo.png |  Bin 0 -> 34853 bytes
 content/img/blog/patterns-blog-3/widest-window.png |  Bin 0 -> 90233 bytes
 .../img/blog/patterns-blog-3/window-clean-up.png   |  Bin 0 -> 15498 bytes
 content/index.html |   13 +-
 .../news/2020/07/30/demo-fraud-detection-3.html|  908 
 content/zh/index.html  |   13 +-
 img/blog/patterns-blog-3/evaluation-delays.png |  Bin 0 -> 29120 bytes
 img/blog/patterns-blog-3/keyed-state-scoping.png   |  Bin 0 -> 199113 bytes
 img/blog/patterns-blog-3/late-events.png   |  Bin 0 -> 20483 bytes
 img/blog/patterns-blog-3/pre-aggregation.png   |  Bin 0 -> 33817 bytes
 .../patterns-blog-3/sample-rule-definition.png |  Bin 0 -> 98413 bytes
 img/blog/patterns-blog-3/time-windows.png  |  Bin 0 -> 37632 bytes
 img/blog/patterns-blog-3/type-kryo.png |  Bin 0 -> 28294 bytes
 img/blog/patterns-blog-3/type-pojo.png |  Bin 0 -> 34853 bytes
 img/blog/patterns-blog-3/widest-window.png |  Bin 0 -> 90233 bytes
 img/blog/patterns-blog-3/window-clean-up.png   |  Bin 0 -> 15498 bytes
 38 files changed, 2791 insertions(+), 411 deletions(-)
 create mode 100644 _posts/2020-07-30-demo-fraud-detection-3.md
 create mode 100644 content/img/blog/patterns-blog-3/evaluation-delays.png
 create mode 100644 content/img/blog/patterns-blog-3/keyed-state-scoping.png
 create mode 100644 content/img/blog/patterns-blog-3/late-events.png
 create mode 100644 content/img/blog/patterns-blog-3/pre-aggregation.png
 create mode 100644 content/img/blog/patterns-blog-3/sample-rule-definition.png
 create mode 100644 content/img/blog/patterns-blog-3/time-windows.png
 create mode 100644 content/img/blog/patterns-blog-3/type-kryo.png
 create mode 100644 content/img/blog/patterns-blog-3/type-pojo.png
 create mode 100644 content/img/blog/patterns-blog-3/widest-window.png
 create mode 100644 content/img/blog/patterns-blog-3/window-clean-up.png
 create mode 100644 content/news/2020/07/30/demo-fraud-detection-3.html
 create mode 100644 img/blog/patterns-blog-3/evaluation-delays.png
 create mode 100644 img/blog/patterns-blog-3/keyed-state-scoping.png
 create mode 100644 img/blog/patterns-blog-3/late-events.png
 create mode 100644 img/blog/patterns-blog-3/pre-aggregation.png
 create mode 100644 img/blog/patterns-blog-3/sample-rule-definition.png
 create mode 100644 img/blog/patterns-blog-3/time-windows.png
 create mode 100644 img/blog/patterns-blog-3/type-kryo.png
 create mode 100644 img/blog/patterns-blog-3/type-pojo.png
 create mode 100644 img/blog/patterns-blog-3/widest-window.png
 create mode 100644 img/blog/patterns-blog-3/window-clean-up.png



[flink-web] 01/02: Add Blogbost: Advanced Flink Application Patterns: Custom Window Processing

2020-07-30 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 707dd32128cf5a10c9ffda807db4e34694c96190
Author: Alexander Fedulov <1492164+afedu...@users.noreply.github.com>
AuthorDate: Wed Jul 22 14:01:40 2020 +0200

Add Blogbost: Advanced Flink Application Patterns: Custom Window Processing

This closes #362.
---
 _posts/2020-07-30-demo-fraud-detection-3.md| 660 +
 img/blog/patterns-blog-3/evaluation-delays.png | Bin 0 -> 29120 bytes
 img/blog/patterns-blog-3/keyed-state-scoping.png   | Bin 0 -> 199113 bytes
 img/blog/patterns-blog-3/late-events.png   | Bin 0 -> 20483 bytes
 img/blog/patterns-blog-3/pre-aggregation.png   | Bin 0 -> 33817 bytes
 .../patterns-blog-3/sample-rule-definition.png | Bin 0 -> 98413 bytes
 img/blog/patterns-blog-3/time-windows.png  | Bin 0 -> 37632 bytes
 img/blog/patterns-blog-3/type-kryo.png | Bin 0 -> 28294 bytes
 img/blog/patterns-blog-3/type-pojo.png | Bin 0 -> 34853 bytes
 img/blog/patterns-blog-3/widest-window.png | Bin 0 -> 90233 bytes
 img/blog/patterns-blog-3/window-clean-up.png   | Bin 0 -> 15498 bytes
 11 files changed, 660 insertions(+)

diff --git a/_posts/2020-07-30-demo-fraud-detection-3.md 
b/_posts/2020-07-30-demo-fraud-detection-3.md
new file mode 100644
index 000..a96ab03
--- /dev/null
+++ b/_posts/2020-07-30-demo-fraud-detection-3.md
@@ -0,0 +1,660 @@
+---
+layout: post
+title: "Advanced Flink Application Patterns Vol.3: Custom Window Processing"
+date: 2020-07-30T12:00:00.000Z
+authors:
+- alex:
+  name: "Alexander Fedulov"
+  twitter: "alex_fedulov"
+categories: news
+excerpt: In this series of blog posts you will learn about powerful Flink 
patterns for building streaming applications.
+---
+
+
+.tg  {border-collapse:collapse;border-spacing:0;}
+.tg td{padding:10px 
10px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;}
+.tg th{padding:10px 
10px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;background-color:#eff0f1;}
+.tg .tg-wide{padding:10px 30px;}
+.tg .tg-top{vertical-align:top}
+.tg .tg-topcenter{text-align:center;vertical-align:top}
+.tg .tg-center{text-align:center;vertical-align:center}
+
+
+## Introduction
+
+In the previous articles of the series, we described how you can achieve
+flexible stream partitioning based on dynamically-updated configurations
+(a set of fraud-detection rules) and how you can utilize Flink's
+Broadcast mechanism to distribute processing configuration at runtime
+among the relevant operators. 
+
+Following up directly where we left the discussion of the end-to-end
+solution last time, in this article we will describe how you can use the
+"Swiss Army knife" of Flink - the [*Process 
Function*](https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/stream/operators/process_function.html)
 to create an
+implementation that is tailor-made to match your streaming business
+logic requirements. Our discussion will continue in the context of the
+[Fraud Detection engine]({{ site.baseurl 
}}/news/2020/01/15/demo-fraud-detection.html#fraud-detection-demo). We will 
also demonstrate how you can
+implement your own **custom replacement for time windows** for cases
+where the out-of-the-box windowing available from the DataStream API
+does not satisfy your requirements. In particular, we will look at the
+trade-offs that you can make when designing a solution which requires
+low-latency reactions to individual events.
+
+This article will describe some high-level concepts that can be applied
+independently, but it is recommended that you review the material in
+[part one]({{ site.baseurl }}/news/2020/01/15/demo-fraud-detection.html) and
+[part two]({{ site.baseurl }}/news/2020/03/24/demo-fraud-detection-2.html) of 
the series as well as checkout the [code
+base](https://github.com/afedulov/fraud-detection-demo) in order to make
+it easier to follow along.
+
+## ProcessFunction as a "Window"
+
+### Low Latency
+
+Let's start with a reminder of the type of fraud detection rule that we
+would like to support:
+
+*"Whenever the **sum** of **payments** from the same **payer** to the
+same **beneficiary** within **a 24 hour
+period** is **greater** than **200 000 $** - trigger an alert."*
+
+In other words, given a stream of transactions partitioned by a key that
+combines the payer and the beneficiary fields, we would like to look
+back in time and determine, for each incoming transaction, if the sum of
+all previous payments between the two specific participants exceeds the
+defined threshold. In effect, the computation window is always moved
+along to the position of the last observed event for a particular data
+partitioning key.
+
+
+
+
+Figure 1: Time Windows

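The windowing semantics described in the excerpt above — for each payer/beneficiary pair, sum all payments in the trailing 24 hours relative to the latest observed event, and alert when the sum exceeds 200 000 $ — can be sketched independently of Flink. The class below is an illustrative stand-in with hypothetical names and plain in-memory state; the actual demo implements this as a Flink KeyedProcessFunction over keyed state and timers:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the rule above: per payer->beneficiary key,
// keep the trailing 24h of payments and alert when their sum exceeds
// the threshold. Names and structure are hypothetical, not the demo's.
public class TrailingWindowSum {
    private static final long WINDOW_MS = 24L * 60 * 60 * 1000;
    private static final double THRESHOLD = 200_000.0;

    private static final class Event {
        final long timestampMs;
        final double amount;

        Event(long timestampMs, double amount) {
            this.timestampMs = timestampMs;
            this.amount = amount;
        }
    }

    // One deque of events per partitioning key, ordered by timestamp.
    private final Map<String, Deque<Event>> stateByKey = new HashMap<>();

    /** Returns true if this payment pushes the trailing 24h sum over the threshold. */
    public boolean onPayment(String payer, String beneficiary, long timestampMs, double amount) {
        String key = payer + "->" + beneficiary;
        Deque<Event> window = stateByKey.computeIfAbsent(key, k -> new ArrayDeque<>());
        window.addLast(new Event(timestampMs, amount));

        // Evict events older than 24h relative to the latest event: the
        // window is "always moved along to the position of the last
        // observed event", as the post puts it.
        while (!window.isEmpty() && window.peekFirst().timestampMs < timestampMs - WINDOW_MS) {
            window.removeFirst();
        }
        double sum = window.stream().mapToDouble(e -> e.amount).sum();
        return sum > THRESHOLD;
    }
}
```

In the actual engine this per-key state lives in Flink keyed state and clean-up is driven by timers rather than eager eviction; the deque above only demonstrates the evaluation semantics and the low-latency property that every incoming event is checked immediately.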
[flink-web] 01/02: Link blogposts

2020-07-28 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 177fe4fbe3028b2b9f9ff00e56ce15665ef4a880
Author: Alexander Fedulov <1492164+afedu...@users.noreply.github.com>
AuthorDate: Thu Jul 2 14:38:40 2020 +0200

Link blogposts

This closes #354.
---
 _posts/2020-01-15-demo-fraud-detection.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/_posts/2020-01-15-demo-fraud-detection.md 
b/_posts/2020-01-15-demo-fraud-detection.md
index 96a3c27..291dde7 100644
--- a/_posts/2020-01-15-demo-fraud-detection.md
+++ b/_posts/2020-01-15-demo-fraud-detection.md
@@ -13,7 +13,7 @@ excerpt: In this series of blog posts you will learn about 
three powerful Flink
 
 In this series of blog posts you will learn about three powerful Flink 
patterns for building streaming applications:
 
- - Dynamic updates of application logic
+ - [Dynamic updates of application logic]({{ site.baseurl 
}}/news/2020/03/24/demo-fraud-detection-2.html)
  - Dynamic data partitioning (shuffle), controlled at runtime
  - Low latency alerting based on custom windowing logic (without using the 
window API)
 
@@ -219,4 +219,4 @@ In the second part of this series, we will describe how the 
rules make their way
 
 
 
-In the next article, we will see how Flink's broadcast streams can be utilized 
to help steer the processing within the Fraud Detection engine at runtime 
(Dynamic Application Updates pattern).
+In the [next article]({{ site.baseurl 
}}/news/2020/03/24/demo-fraud-detection-2.html), we will see how Flink's 
broadcast streams can be utilized to help steer the processing within the Fraud 
Detection engine at runtime (Dynamic Application Updates pattern).



[flink-web] 02/02: Rebuild website

2020-07-28 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 55b6c7c4379ca3ba6dfca5b720c4aa167ab4f779
Author: Nico Kruber 
AuthorDate: Tue Jul 28 16:52:43 2020 +0200

Rebuild website
---
 content/blog/feed.xml | 6 +++---
 content/news/2020/01/15/demo-fraud-detection.html | 4 ++--
 2 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index a77152d..4f96e80 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -13,7 +13,7 @@
 <p>In the following sections, we describe how to integrate Kafka, MySQL, Elasticsearch, and Kibana with Flink SQL to analyze e-commerce user behavior in real-time. All exercises in this blogpost are performed in the Flink SQL CLI, and the entire process uses standard SQL syntax, without a single line of Java/Scala code or IDE installation. The final result of this demo is shown in the following figure:</p>
 
 <center>
-<img src="/img/blog/2020-05-03-flink-sql-demo/image1.gif" width="650px" alt="Demo Overview" />
+<img src="/img/blog/2020-07-28-flink-sql-demo/image1.gif" width="650px" alt="Demo Overview" />
 </center>
 <p><br /></p>
 
@@ -5125,7 +5125,7 @@ However, you need to take care of another aspect, which is providing timestamps
 <p>In this series of blog posts you will learn about three powerful Flink patterns for building streaming applications:</p>
 
 <ul>
-  <li>Dynamic updates of application logic</li>
+  <li><a href="/news/2020/03/24/demo-fraud-detection-2.html">Dynamic updates of application logic</a></li>
   <li>Dynamic data partitioning (shuffle), controlled at runtime</li>
   <li>Low latency alerting based on custom windowing logic (without using the window API)</li>
 </ul>
@@ -5325,7 +5325,7 @@ To understand why this is the case, let us start with articulating a realistic s
 </center>
 <p><br /></p>
 
-<p>In the next article, we will see how Flink’s broadcast streams can be utilized to help steer the processing within the Fraud Detection engine at runtime (Dynamic Application Updates pattern).</p>
+<p>In the <a href="/news/2020/03/24/demo-fraud-detection-2.html">next article</a>, we will see how Flink’s broadcast streams can be utilized to help steer the processing within the Fraud Detection engine at runtime (Dynamic Application Updates pattern).</p>
 
 <pubDate>Wed, 15 Jan 2020 13:00:00 +0100</pubDate>
 <link>https://flink.apache.org/news/2020/01/15/demo-fraud-detection.html</link>
diff --git a/content/news/2020/01/15/demo-fraud-detection.html 
b/content/news/2020/01/15/demo-fraud-detection.html
index 22fe277..dcb51b4 100644
--- a/content/news/2020/01/15/demo-fraud-detection.html
+++ b/content/news/2020/01/15/demo-fraud-detection.html
@@ -200,7 +200,7 @@
 <p>In this series of blog posts you will learn about three powerful Flink patterns for building streaming applications:</p>
 
 <ul>
-  <li>Dynamic updates of application logic</li>
+  <li><a href="/news/2020/03/24/demo-fraud-detection-2.html">Dynamic updates of application logic</a></li>
   <li>Dynamic data partitioning (shuffle), controlled at runtime</li>
   <li>Low latency alerting based on custom windowing logic (without using the window API)</li>
 </ul>
@@ -400,7 +400,7 @@ To understand why this is the case, let us start with articulating a realistic s
 
-<p>In the next article, we will see how Flink’s broadcast streams can be utilized to help steer the processing within the Fraud Detection engine at runtime (Dynamic Application Updates pattern).</p>
+<p>In the <a href="/news/2020/03/24/demo-fraud-detection-2.html">next article</a>, we will see how Flink’s broadcast streams can be utilized to help steer the processing within the Fraud Detection engine at runtime (Dynamic Application Updates pattern).</p>
 
   
 



[flink-web] branch asf-site updated (fb6e73a -> 55b6c7c)

2020-07-28 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from fb6e73a  fix links and rebuild
 new 177fe4f  Link blogposts
 new 55b6c7c  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 _posts/2020-01-15-demo-fraud-detection.md | 4 ++--
 content/blog/feed.xml | 6 +++---
 content/news/2020/01/15/demo-fraud-detection.html | 4 ++--
 3 files changed, 7 insertions(+), 7 deletions(-)



[flink-training] branch master updated (22420e1 -> 163558d)

2020-06-08 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git.


from 22420e1  [hotfix] use correct main class references
 new b7fecd4  [hotfix][training] git-ignore eclipse build files
 new 163558d  [FLINK-18178][build] fix Eclipse import not using 
flinkShadowJar dependencies

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .gitignore   | 10 --
 build.gradle |  7 +++
 2 files changed, 15 insertions(+), 2 deletions(-)



[flink-training] 02/02: [FLINK-18178][build] fix Eclipse import not using flinkShadowJar dependencies

2020-06-08 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit 163558d39bc2c15796d180dc5ae8a7943d608679
Author: Nico Kruber 
AuthorDate: Mon Jun 8 14:05:56 2020 +0200

[FLINK-18178][build] fix Eclipse import not using flinkShadowJar 
dependencies
---
 build.gradle | 7 +++
 1 file changed, 7 insertions(+)

diff --git a/build.gradle b/build.gradle
index 5e54c95..5aae798 100644
--- a/build.gradle
+++ b/build.gradle
@@ -27,6 +27,7 @@ subprojects {
 }
 apply plugin: 'com.github.johnrengelman.shadow'
 apply plugin: 'checkstyle'
+apply plugin: 'eclipse'
 
 // artifact properties
 group = 'org.apache.flink'
@@ -113,6 +114,12 @@ subprojects {
 javadoc.classpath += configurations.flinkShadowJar
 }
 
+eclipse {
+classpath {
+plusConfigurations += [configurations.flinkShadowJar]
+}
+}
+
 if (plugins.findPlugin('application')) {
 applicationDefaultJvmArgs = ["-Dlog4j.configuration=log4j.properties"]
 run.classpath = sourceSets.main.runtimeClasspath



[flink-training] 01/02: [hotfix][training] git-ignore eclipse build files

2020-06-08 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git

commit b7fecd43861cbb7ef15c55f5887cc121f4b18acc
Author: Nico Kruber 
AuthorDate: Mon Jun 8 12:33:37 2020 +0200

[hotfix][training] git-ignore eclipse build files
---
 .gitignore | 10 --
 1 file changed, 8 insertions(+), 2 deletions(-)

diff --git a/.gitignore b/.gitignore
index ba34970..d47fbf8 100644
--- a/.gitignore
+++ b/.gitignore
@@ -10,9 +10,15 @@
 # Debugger
 .attach_*
 
+# Eclipse
+.project
+.settings
+.classpath
+bin/
+
 # Gradle build process files
-**/.gradle/**/*
-**/build/**/*
+/.gradle/
+build/
 **/.gradletasknamecache
 
 # IntelliJ



[flink-training] branch master updated: [hotfix] use correct main class references

2020-06-08 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-training.git


The following commit(s) were added to refs/heads/master by this push:
 new 22420e1  [hotfix] use correct main class references
22420e1 is described below

commit 22420e10bc970eea0eb84379fbf68a7d0d0eac78
Author: David Anderson 
AuthorDate: Mon Jun 8 12:31:46 2020 +0200

[hotfix] use correct main class references
---
 hourly-tips/build.gradle  | 2 +-
 long-ride-alerts/build.gradle | 2 +-
 ride-cleansing/build.gradle   | 2 +-
 rides-and-fares/build.gradle  | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/hourly-tips/build.gradle b/hourly-tips/build.gradle
index e0b929e..11df511 100644
--- a/hourly-tips/build.gradle
+++ b/hourly-tips/build.gradle
@@ -1 +1 @@
-mainClassName = 
'org.apache.flink.training.exercises.windows.hourlytips.HourlyTipsExercise'
+mainClassName = 
'org.apache.flink.training.exercises.hourlytips.HourlyTipsExercise'
diff --git a/long-ride-alerts/build.gradle b/long-ride-alerts/build.gradle
index 61aa69b..62e4ffd 100644
--- a/long-ride-alerts/build.gradle
+++ b/long-ride-alerts/build.gradle
@@ -1 +1 @@
-mainClassName = 
'org.apache.flink.training.exercises.process.longrides.LongRidesExercise'
+mainClassName = 
'org.apache.flink.training.exercises.longrides.LongRidesExercise'
diff --git a/ride-cleansing/build.gradle b/ride-cleansing/build.gradle
index 2a9d0aa..106e3b9 100644
--- a/ride-cleansing/build.gradle
+++ b/ride-cleansing/build.gradle
@@ -1 +1 @@
-mainClassName = 
'org.apache.flink.training.exercises.basics.ridecleansing.RideCleansingExercise'
+mainClassName = 
'org.apache.flink.training.exercises.ridecleansing.RideCleansingExercise'
diff --git a/rides-and-fares/build.gradle b/rides-and-fares/build.gradle
index 1d013ef..18366af 100644
--- a/rides-and-fares/build.gradle
+++ b/rides-and-fares/build.gradle
@@ -1 +1 @@
-mainClassName = 
'org.apache.flink.training.exercises.state.ridesandfares.RidesAndFaresExercise'
+mainClassName = 
'org.apache.flink.training.exercises.ridesandfares.RidesAndFaresExercise'



[flink-benchmarks] branch master updated: [FLINK-17287][github] Disable merge commit button

2020-06-05 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-benchmarks.git


The following commit(s) were added to refs/heads/master by this push:
 new 8b44986  [FLINK-17287][github] Disable merge commit button
8b44986 is described below

commit 8b449865cf733dbb3c01e997fe44b1a5b6f82cdc
Author: Nico Kruber 
AuthorDate: Fri Jun 5 11:46:38 2020 +0200

[FLINK-17287][github] Disable merge commit button
---
 .asf.yaml | 5 +
 1 file changed, 5 insertions(+)

diff --git a/.asf.yaml b/.asf.yaml
new file mode 100644
index 000..930c896
--- /dev/null
+++ b/.asf.yaml
@@ -0,0 +1,5 @@
+github:
+  enabled_merge_buttons:
+squash: true
+merge: false
+rebase: true



[flink-benchmarks] branch master updated (c1db02d -> a37619d)

2020-06-05 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-benchmarks.git.


from c1db02d  [init] Add ASL 2.0 License
 new e16896a  Add skeleton for micro benchmarks twith first benchmark
 new 0134b6a  Add KeyByBenchmark
 new 72afb9b  Add TimeWindowBenchmark
 new a68dca7  Dump results to csv file
 new 0016349  Add benchmarks for different state backends
 new 469188b  make KeyByBenchmark#SumReduce a static inner class
 new e4e9024  Extract user functions
 new a7947c4  Improve codespeed
 new 0c33a3e  Extract StateBackendBenchmark
 new c03925f  Reduce number of parameters
 new 90ab919  Bump number of iterations to production ready values
 new e1267b6  Remove fork value from pom file
 new 2953086  Change units in benchmarks
 new 35aeff7  Extract common logic for IntLong benchmarks
 new d42afaf  Benchmark state using tumbling windows
 new df6a089  Use different number of operations per invocation in memory 
and rocks benchmarks
 new 621f73e  Make all benchmark inherite from BenchmarkBase
 new 8fd1ae4  Extract BenchmarkBase for non integration Flink benchmarks
 new b662f8e  Clean up pom.xml
 new ca39899  Update Flink to 1.5-SNAPSHOT
 new 58687db  Run NetworkBenchmarks from flink
 new aca8216  Ignore empty parameters in benchmark name generation
 new 313bfd7  add benchmarks for the keyBy() operation on tuples and arrays
 new 4a453df  Remove incorrect comment from pom
 new 062f96b  Add README.md
 new 9e6eee1  [hotfix] make KeyByBenchmarks#IncreasingBaseSource more 
generic, creating BaseSourceWithKeyRange
 new 8cd86ae  add SerializationFrameworkBenchmarks for Pojo, Tuple, Kryo, 
and Avro
 new 426ac4c  Update default JVM options and bump default Flink version to 
1.7
 new a1d2ebe  Add network 1000,1ms benchmark
 new a8bc679  cleanup pom.xml (#8)
 new da30375  Add broadcast network 100,100ms benchmark
 new 9a95e89  clarify licensing situation with respect to jmh
 new 84532bd  add SSL network benchmarks and change overall benchmark set
 new 2ddbb7a  Add general remarks to the README
 new 21b97fe  [FLINK-11986] [state backend, tests] Add micro benchmark for 
state operations
 new 8b71d9c  Add support to generate shaded benchmark package to allow run 
specific case in command line
 new 5d7aec8  [hotfix] Missing quotes in example maven command in README
 new 97ffd65  [hotfix] Exclude state benchmark from the Flink suit
 new c25d921  [hotfix] Fix NoClassDefFoundError when running network 
related benchmarks
 new 136c166  Run all benchmarks once on Travis CI
 new ca2d06b  Use maven profile to differentiate travis and jenkins run
 new cd16e99  Update the dependent flink version to 1.9-SNAPSHOT
 new 4372ba0  Add InputBenchmark
 new 70fb9bd  Explain naming convention in the README
 new 9c338a9  Add couple of details to the README and a note how to pick 
length of a new benchmark
 new b2c1aa2  Add TwoInputBenchmark
 new 75d7b6e  [FLINK-11877] Add an input idle benchmark for two input 
operators
 new caf4917  [hotfix] Fix typo in static variable
 new 3b55f59  [hotfix] Fix formatting of pom.xml
 new 9e3833a  [hotfix] Add missing dependencies
 new ab0015d  [FLINK-12818] Improve stability of twoInputMapSink benchmark
 new a4ae3ae  [hotfix] adapt StreamNetworkThroughputBenchmarkExecutor to 
latest changes in Flink
 new c2f6793  Delete root directory when teardown the benchmark
 new 15064ae  Let list state benchmark stable
 new d47b530  Update flink-tests artifact id
 new 6ba88de  Adding benchmark for AsyncWaitOperator
 new 707905f  [FLINK-14118] Add throughput benchmark executor for data skew 
scenario. (#31)
 new 46b7ce9  [FLINK-14118] Bump number of writer threads in network data 
skew Benchmark to 10
 new 735bd06  [hotfix] Update flink version from 1.9 to 1.10.
 new 91d5e0b  [hotfix] Increase max memory of the test jvm to 6g.
 new fd337c6  [FLINK-14783] Add ContinuousFileReaderOperator benchmark (#34)
 new cb73ac5  [FLINK-13846] Add benchmark of mapstate isEmpty
 new 4625672  [FLINK-14346] [serialization] Add string-heavy version of 
SerializationFrameworkMiniBenchmark
 new cd6f0a3  [hotfix] Make StateBackendContext extends 
FlinkEnvironmentContext for deduplication.
 new f54e93b  [FLINK-15103] Set number of network buffers to be the same as 
before flip49 for all benchmarks using local executor.
 new 322d34d  [hotfix] Bump default Flink version to 1.11-SNAPSHOT
 new bbac57c  [FLINK-15199] Fix compile error after FLINK-14926
 new e64a0e0  [FLINK-15070] Add benchmarks for blocking partition
 new a63a4bc  fixup! [FLINK-15103] Set number of network buffers to be the 
same as before flip49 for all benchmarks using local executor.
 new cfcbb96

[flink-web] 01/02: [FLINK-17490] Add training page

2020-05-05 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit ac0064a703c88b68ba9f6492129f9f201c3ac085
Author: David Anderson 
AuthorDate: Fri May 1 13:21:35 2020 +0200

[FLINK-17490] Add training page

This closes #333.
---
 _data/i18n.yml|   4 +-
 _includes/navbar.html |   1 +
 training.md   | 109 ++
 training.zh.md| 109 ++
 4 files changed, 222 insertions(+), 1 deletion(-)

diff --git a/_data/i18n.yml b/_data/i18n.yml
index f5342be..39548af 100644
--- a/_data/i18n.yml
+++ b/_data/i18n.yml
@@ -23,6 +23,7 @@ en:
 contribute_website: Contribute to the Website
 roadmap: Roadmap
 tutorials: Tutorials
+training_course: Training Course
 
 zh:
 what_is_flink: Apache Flink 是什么?
@@ -48,4 +49,5 @@ zh:
 docs_style_guide: Documentation Style Guide
 contribute_website: 贡献网站
 roadmap: 开发计划
-tutorials: 教程
\ No newline at end of file
+tutorials: 教程
+training_course: Training Course
diff --git a/_includes/navbar.html b/_includes/navbar.html
index 1603d57..d9cbb0a 100755
--- a/_includes/navbar.html
+++ b/_includes/navbar.html
@@ -71,6 +71,7 @@
   
 With Flink 
 With Flink Stateful 
Functions 
+{{ 
site.data.i18n[page.language].training_course }}
   
 
 
diff --git a/training.md b/training.md
new file mode 100644
index 000..0532f54
--- /dev/null
+++ b/training.md
@@ -0,0 +1,109 @@
+---
+title: "Training Course"
+---
+
+
+
+The Apache Flink community maintains a self-paced training course that contains
+a set of lessons and hands-on exercises. This step-by-step introduction to 
Flink focuses
+on learning how to use the DataStream API to meet the needs of common, 
real-world use cases.
+
+This training covers the fundamentals of Flink, including:
+
+
+
+
+
+  Intro 
to Flink
+
+
+
+Batch vs. Streaming
+Parallel Dataflows
+State, Time, and Snapshots
+
+
+
+
+
+
+
+ Intro to 
the DataStream API
+
+
+
+Data Types and Serialization
+Architecture
+Sources and Sinks
+
+
+
+
+
+
+
+ Data 
Pipelines and ETL
+
+
+
+Transformations
+Stateful Stream Processing
+Connected Streams
+
+
+
+
+
+
+
+
+
+ Streaming 
Analytics
+
+
+
+Event Time Processing
+Watermarks
+Windows
+
+
+
+
+
+
+
+ 
Event-driven Applications
+
+
+
+Process Functions
+Timers
+Side Outputs
+
+
+
+
+
+
+
+ Fault 
Tolerance
+
+
+
+Checkpoints and Savepoints
+Exactly-once vs. At-least-once
+Exactly-once End-to-end
+
+
+
+
+
+
+
+Apache Flink Training Course   
+
+
+
diff --git a/training.zh.md b/training.zh.md
new file mode 100644
index 000..b5ec9ae
--- /dev/null
+++ b/training.zh.md
@@ -0,0 +1,109 @@
+---
+title: "Training Course"
+---
+
+
+
+The Apache Flink community maintains a self-paced training course that contains
+a set of lessons and hands-on exercises. This step-by-step introduction to 
Flink focuses
+on learning how to use the DataStream API to meet the needs of common, 
real-world use cases.
+
+This training covers the fundamentals of Flink, including:
+
+
+
+
+
+  Intro 
to Flink
+
+
+
+Batch vs. Streaming
+Parallel Dataflows
+State, Time, and Snapshots
+
+
+
+
+
+
+
+ Intro to 
the DataStream API
+
+
+
+Data Types and Serialization
+Architecture
+Sources and Sinks
+
+
+
+
+
+
+
+ Dat
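[Editorial note] The training page added in this patch lists Watermarks under the Streaming Analytics lesson. As a simplified illustration of that concept — independent of the patch, and much reduced from Flink's actual WatermarkStrategy/WatermarkGenerator API — a bounded-out-of-orderness watermark simply trails the highest event timestamp seen so far by a fixed lateness bound:

```java
// Simplified bounded-out-of-orderness watermarking: the watermark trails
// the maximum event timestamp seen so far by a fixed out-of-orderness bound.
// Flink's real watermark machinery (periodic emission, per-partition
// watermarks, idleness handling) is not modeled here.
public class BoundedOutOfOrdernessSketch {

    private final long maxOutOfOrdernessMs;
    private long maxTimestampSeen = Long.MIN_VALUE;

    public BoundedOutOfOrdernessSketch(long maxOutOfOrdernessMs) {
        this.maxOutOfOrdernessMs = maxOutOfOrdernessMs;
    }

    // Called for every event; only the running maximum timestamp is tracked.
    public void onEvent(long eventTimestampMs) {
        maxTimestampSeen = Math.max(maxTimestampSeen, eventTimestampMs);
    }

    // Events with timestamps at or below this value are considered late.
    public long currentWatermark() {
        return maxTimestampSeen - maxOutOfOrdernessMs - 1;
    }
}
```

For example, after seeing events at timestamps 10000 and 8000 with a 5000 ms bound, the watermark is 4999: the stream asserts that no event older than that should still arrive.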

[flink-web] branch asf-site updated (f6c2c95 -> 827e876)

2020-05-05 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from f6c2c95  rebuild website
 new ac0064a  [FLINK-17490] Add training page
 new 827e876  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 _data/i18n.yml |   4 +-
 _includes/navbar.html  |   1 +
 content/2019/05/03/pulsar-flink.html   |   1 +
 content/2019/05/14/temporal-tables.html|   1 +
 content/2019/05/19/state-ttl.html  |   1 +
 content/2019/06/05/flink-network-stack.html|   1 +
 content/2019/06/26/broadcast-state.html|   1 +
 content/2019/07/23/flink-network-stack-2.html  |   1 +
 content/2020/04/09/pyflink-udf-support-flink.html  |   1 +
 content/blog/index.html|   1 +
 content/blog/page10/index.html |   1 +
 content/blog/page11/index.html |   1 +
 content/blog/page2/index.html  |   1 +
 content/blog/page3/index.html  |   1 +
 content/blog/page4/index.html  |   1 +
 content/blog/page5/index.html  |   1 +
 content/blog/page6/index.html  |   1 +
 content/blog/page7/index.html  |   1 +
 content/blog/page8/index.html  |   1 +
 content/blog/page9/index.html  |   1 +
 .../blog/release_1.0.0-changelog_known_issues.html |   1 +
 content/blog/release_1.1.0-changelog.html  |   1 +
 content/blog/release_1.2.0-changelog.html  |   1 +
 content/blog/release_1.3.0-changelog.html  |   1 +
 content/community.html |   1 +
 .../code-style-and-quality-common.html |   1 +
 .../code-style-and-quality-components.html |   1 +
 .../code-style-and-quality-formatting.html |   1 +
 .../contributing/code-style-and-quality-java.html  |   1 +
 .../code-style-and-quality-preamble.html   |   1 +
 .../code-style-and-quality-pull-requests.html  |   1 +
 .../contributing/code-style-and-quality-scala.html |   1 +
 content/contributing/contribute-code.html  |   1 +
 content/contributing/contribute-documentation.html |   1 +
 content/contributing/docs-style.html   |   1 +
 content/contributing/how-to-contribute.html|   1 +
 content/contributing/improve-website.html  |   1 +
 content/contributing/reviewing-prs.html|   1 +
 content/documentation.html |   1 +
 content/downloads.html |   1 +
 content/ecosystem.html |   1 +
 .../apache-beam-how-beam-runs-on-top-of-flink.html |   1 +
 .../feature/2019/09/13/state-processor-api.html|   1 +
 .../2017/07/04/flink-rescalable-state.html |   1 +
 .../2018/01/30/incremental-checkpointing.html  |   1 +
 .../01/end-to-end-exactly-once-apache-flink.html   |   1 +
 .../features/2019/03/11/prometheus-monitoring.html |   1 +
 .../2020/03/27/flink-for-data-warehouse.html   |   1 +
 content/flink-applications.html|   1 +
 content/flink-architecture.html|   1 +
 content/flink-operations.html  |   1 +
 content/gettinghelp.html   |   1 +
 content/index.html |   1 +
 content/material.html  |   1 +
 content/news/2014/08/26/release-0.6.html   |   1 +
 content/news/2014/09/26/release-0.6.1.html |   1 +
 content/news/2014/10/03/upcoming_events.html   |   1 +
 content/news/2014/11/04/release-0.7.0.html |   1 +
 content/news/2014/11/18/hadoop-compatibility.html  |   1 +
 content/news/2015/01/06/december-in-flink.html |   1 +
 content/news/2015/01/21/release-0.8.html   |   1 +
 content/news/2015/02/04/january-in-flink.html  |   1 +
 content/news/2015/02/09/streaming-example.html |   1 +
 .../news/2015/03/02/february-2015-in-flink.html|   1 +
 .../13/peeking-into-Apache-Flinks-Engine-Room.html |   1 +
 content/news/2015/04/07/march-in-flink.html|   1 +
 .../news/2015/04/13/release-0.9.0-milestone1.html  |   1 +
 .../2015/05/11/Juggling-with-Bits-and-Bytes.html   |   1 +
 .../news/2015/05/14/Community-update-April.html|   1 +
 .../24/announcing-apache-flink-0.9.0-release.html  |   1 +
 .../news/2015/08/24/introducing-flink-gelly.html   |   1 +
 content/news/2015/09/01/release-0.9.1.html |   1 +
 content/news/2015/09/03/flink-forward.html |   1 +
 content/news/2015/09/16/off-heap-mem

[flink] branch master updated: [FLINK-17244][docs] Update the Getting Started page (#11988)

2020-05-05 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 4cb58e7  [FLINK-17244][docs] Update the Getting Started page (#11988)
4cb58e7 is described below

commit 4cb58e747120b225bf6c2802e67bb0ca170dbf38
Author: David Anderson 
AuthorDate: Tue May 5 17:06:08 2020 +0200

[FLINK-17244][docs] Update the Getting Started page (#11988)

* Update docs/getting-started/index.md
---
 docs/getting-started/index.md | 45 ++-
 1 file changed, 32 insertions(+), 13 deletions(-)

diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md
index 7d81c48..fffc502 100644
--- a/docs/getting-started/index.md
+++ b/docs/getting-started/index.md
@@ -28,21 +28,22 @@ under the License.
 -->
 
 There are many ways to get started with Apache Flink. Which one is the best for
-you depends on your goal and prior experience:
+you depends on your goals and prior experience:
 
-* take a look at the **Docker Playgrounds** for a docker-based introduction to
-  specific Flink concepts
-* explore on of the **Code Walkthroughs** if you want to get an end-to-end
-  introduction to using one of the Flink APIs
-* use **Project Setup** if you already know the basics of Flink but want to 
get a
-  project setup template for Java or Scala and need help setting up
-  dependencies
+* take a look at the **Docker Playgrounds** if you want to see what Flink can 
do, via a hands-on,
+  docker-based introduction to specific Flink concepts
+* explore one of the **Code Walkthroughs** if you want a quick, end-to-end
+  introduction to one of Flink's APIs
+* work your way through the **Hands-on Training** for a comprehensive,
+  step-by-step introduction to Flink
+* use **Project Setup** if you already know the basics of Flink and want a
+  project template for Java or Scala, or need help setting up the dependencies
 
 ### Taking a first look at Flink
 
 The **Docker Playgrounds** provide sandboxed Flink environments that are set 
up in just a few minutes and which allow you to explore and play with Flink.
 
-* The [**Operations 
Playground**](./docker-playgrounds/flink-operations-playground.html) shows you 
how to operate streaming applications with Flink. You can experience how Flink 
recovers application from failures, upgrade and scale streaming applications up 
and down, and query application metrics.
+* The [**Operations Playground**]({% link 
getting-started/docker-playgrounds/flink-operations-playground.md %}) shows you 
how to operate streaming applications with Flink. You can experience how Flink 
recovers application from failures, upgrade and scale streaming applications up 
and down, and query application metrics.
 
 

[flink] branch master updated (ea51116 -> c955eb3)

2020-04-28 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from ea51116  [FLINK-17111][table] Support SHOW VIEWS in Flink SQL
 add c955eb3  [FLINK-17432][docs-training] Rename Tutorials to Training for 
better SEO (#11931)

No new revisions were added by this update.

Summary of changes:
 docs/concepts/index.md | 10 -
 docs/concepts/index.zh.md  | 10 -
 docs/redirects/tutorials_overview.md   | 24 ++
 docs/{tutorials => training}/datastream_api.md |  7 +++
 docs/{tutorials => training}/datastream_api.zh.md  |  7 +++
 docs/{tutorials => training}/etl.md|  8 
 docs/{tutorials => training}/etl.zh.md |  8 
 docs/{tutorials => training}/event_driven.md   |  8 
 docs/{tutorials => training}/event_driven.zh.md|  8 
 docs/{tutorials => training}/fault_tolerance.md|  2 +-
 docs/{tutorials => training}/fault_tolerance.zh.md |  2 +-
 docs/{tutorials => training}/index.md  | 22 ++--
 docs/{tutorials => training}/index.zh.md   | 22 ++--
 .../{tutorials => training}/streaming_analytics.md |  8 
 .../streaming_analytics.zh.md  |  8 
 15 files changed, 88 insertions(+), 66 deletions(-)
 create mode 100644 docs/redirects/tutorials_overview.md
 rename docs/{tutorials => training}/datastream_api.md (97%)
 rename docs/{tutorials => training}/datastream_api.zh.md (97%)
 rename docs/{tutorials => training}/etl.md (98%)
 rename docs/{tutorials => training}/etl.zh.md (98%)
 rename docs/{tutorials => training}/event_driven.md (97%)
 rename docs/{tutorials => training}/event_driven.zh.md (97%)
 rename docs/{tutorials => training}/fault_tolerance.md (99%)
 rename docs/{tutorials => training}/fault_tolerance.zh.md (99%)
 rename docs/{tutorials => training}/index.md (93%)
 rename docs/{tutorials => training}/index.zh.md (93%)
 rename docs/{tutorials => training}/streaming_analytics.md (98%)
 rename docs/{tutorials => training}/streaming_analytics.zh.md (98%)


