[flink] branch master updated (a884706 -> 4b25ba2)

2020-07-30 Thread dianfu
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from a884706  [FLINK-12175] Change filling of typeHierarchy in analyzePojo, 
for correctly creating fields TypeInfo
 add 4b25ba2  [hotfix][docs] Fix 'event_timestamp_extractors.zh.md' by updating the Chinese doc link

No new revisions were added by this update.

Summary of changes:
 docs/dev/event_timestamp_extractors.zh.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink-web] branch asf-site updated (734f4a1 -> 008e907)

2020-07-30 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from 734f4a1  Rebuild website
 new 707dd32  Add Blogpost: Advanced Flink Application Patterns: Custom Window Processing
 new 008e907  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 _posts/2020-07-30-demo-fraud-detection-3.md|  660 
 content/blog/feed.xml  | 1115 
 content/blog/index.html|   38 +-
 content/blog/page10/index.html |   38 +-
 content/blog/page11/index.html |   44 +-
 content/blog/page12/index.html |   45 +-
 content/blog/page13/index.html |   25 +
 content/blog/page2/index.html  |   40 +-
 content/blog/page3/index.html  |   38 +-
 content/blog/page4/index.html  |   36 +-
 content/blog/page5/index.html  |   38 +-
 content/blog/page6/index.html  |   40 +-
 content/blog/page7/index.html  |   38 +-
 content/blog/page8/index.html  |   36 +-
 content/blog/page9/index.html  |   37 +-
 .../img/blog/patterns-blog-3/evaluation-delays.png |  Bin 0 -> 29120 bytes
 .../blog/patterns-blog-3/keyed-state-scoping.png   |  Bin 0 -> 199113 bytes
 content/img/blog/patterns-blog-3/late-events.png   |  Bin 0 -> 20483 bytes
 .../img/blog/patterns-blog-3/pre-aggregation.png   |  Bin 0 -> 33817 bytes
 .../patterns-blog-3/sample-rule-definition.png |  Bin 0 -> 98413 bytes
 content/img/blog/patterns-blog-3/time-windows.png  |  Bin 0 -> 37632 bytes
 content/img/blog/patterns-blog-3/type-kryo.png |  Bin 0 -> 28294 bytes
 content/img/blog/patterns-blog-3/type-pojo.png |  Bin 0 -> 34853 bytes
 content/img/blog/patterns-blog-3/widest-window.png |  Bin 0 -> 90233 bytes
 .../img/blog/patterns-blog-3/window-clean-up.png   |  Bin 0 -> 15498 bytes
 content/index.html |   13 +-
 .../news/2020/07/30/demo-fraud-detection-3.html|  908 
 content/zh/index.html  |   13 +-
 img/blog/patterns-blog-3/evaluation-delays.png |  Bin 0 -> 29120 bytes
 img/blog/patterns-blog-3/keyed-state-scoping.png   |  Bin 0 -> 199113 bytes
 img/blog/patterns-blog-3/late-events.png   |  Bin 0 -> 20483 bytes
 img/blog/patterns-blog-3/pre-aggregation.png   |  Bin 0 -> 33817 bytes
 .../patterns-blog-3/sample-rule-definition.png |  Bin 0 -> 98413 bytes
 img/blog/patterns-blog-3/time-windows.png  |  Bin 0 -> 37632 bytes
 img/blog/patterns-blog-3/type-kryo.png |  Bin 0 -> 28294 bytes
 img/blog/patterns-blog-3/type-pojo.png |  Bin 0 -> 34853 bytes
 img/blog/patterns-blog-3/widest-window.png |  Bin 0 -> 90233 bytes
 img/blog/patterns-blog-3/window-clean-up.png   |  Bin 0 -> 15498 bytes
 38 files changed, 2791 insertions(+), 411 deletions(-)
 create mode 100644 _posts/2020-07-30-demo-fraud-detection-3.md
 create mode 100644 content/img/blog/patterns-blog-3/evaluation-delays.png
 create mode 100644 content/img/blog/patterns-blog-3/keyed-state-scoping.png
 create mode 100644 content/img/blog/patterns-blog-3/late-events.png
 create mode 100644 content/img/blog/patterns-blog-3/pre-aggregation.png
 create mode 100644 content/img/blog/patterns-blog-3/sample-rule-definition.png
 create mode 100644 content/img/blog/patterns-blog-3/time-windows.png
 create mode 100644 content/img/blog/patterns-blog-3/type-kryo.png
 create mode 100644 content/img/blog/patterns-blog-3/type-pojo.png
 create mode 100644 content/img/blog/patterns-blog-3/widest-window.png
 create mode 100644 content/img/blog/patterns-blog-3/window-clean-up.png
 create mode 100644 content/news/2020/07/30/demo-fraud-detection-3.html
 create mode 100644 img/blog/patterns-blog-3/evaluation-delays.png
 create mode 100644 img/blog/patterns-blog-3/keyed-state-scoping.png
 create mode 100644 img/blog/patterns-blog-3/late-events.png
 create mode 100644 img/blog/patterns-blog-3/pre-aggregation.png
 create mode 100644 img/blog/patterns-blog-3/sample-rule-definition.png
 create mode 100644 img/blog/patterns-blog-3/time-windows.png
 create mode 100644 img/blog/patterns-blog-3/type-kryo.png
 create mode 100644 img/blog/patterns-blog-3/type-pojo.png
 create mode 100644 img/blog/patterns-blog-3/widest-window.png
 create mode 100644 img/blog/patterns-blog-3/window-clean-up.png



[flink-web] 01/02: Add Blogpost: Advanced Flink Application Patterns: Custom Window Processing

2020-07-30 Thread nkruber
This is an automated email from the ASF dual-hosted git repository.

nkruber pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 707dd32128cf5a10c9ffda807db4e34694c96190
Author: Alexander Fedulov <1492164+afedu...@users.noreply.github.com>
AuthorDate: Wed Jul 22 14:01:40 2020 +0200

Add Blogpost: Advanced Flink Application Patterns: Custom Window Processing

This closes #362.
---
 _posts/2020-07-30-demo-fraud-detection-3.md| 660 +
 img/blog/patterns-blog-3/evaluation-delays.png | Bin 0 -> 29120 bytes
 img/blog/patterns-blog-3/keyed-state-scoping.png   | Bin 0 -> 199113 bytes
 img/blog/patterns-blog-3/late-events.png   | Bin 0 -> 20483 bytes
 img/blog/patterns-blog-3/pre-aggregation.png   | Bin 0 -> 33817 bytes
 .../patterns-blog-3/sample-rule-definition.png | Bin 0 -> 98413 bytes
 img/blog/patterns-blog-3/time-windows.png  | Bin 0 -> 37632 bytes
 img/blog/patterns-blog-3/type-kryo.png | Bin 0 -> 28294 bytes
 img/blog/patterns-blog-3/type-pojo.png | Bin 0 -> 34853 bytes
 img/blog/patterns-blog-3/widest-window.png | Bin 0 -> 90233 bytes
 img/blog/patterns-blog-3/window-clean-up.png   | Bin 0 -> 15498 bytes
 11 files changed, 660 insertions(+)

diff --git a/_posts/2020-07-30-demo-fraud-detection-3.md 
b/_posts/2020-07-30-demo-fraud-detection-3.md
new file mode 100644
index 000..a96ab03
--- /dev/null
+++ b/_posts/2020-07-30-demo-fraud-detection-3.md
@@ -0,0 +1,660 @@
+---
+layout: post
+title: "Advanced Flink Application Patterns Vol.3: Custom Window Processing"
+date: 2020-07-30T12:00:00.000Z
+authors:
+- alex:
+  name: "Alexander Fedulov"
+  twitter: "alex_fedulov"
+categories: news
+excerpt: In this series of blog posts you will learn about powerful Flink 
patterns for building streaming applications.
+---
+
+<style>
+.tg  {border-collapse:collapse;border-spacing:0;}
+.tg td{padding:10px 10px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;}
+.tg th{padding:10px 10px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;background-color:#eff0f1;}
+.tg .tg-wide{padding:10px 30px;}
+.tg .tg-top{vertical-align:top}
+.tg .tg-topcenter{text-align:center;vertical-align:top}
+.tg .tg-center{text-align:center;vertical-align:center}
+</style>
+
+## Introduction
+
+In the previous articles of the series, we described how you can achieve
+flexible stream partitioning based on dynamically-updated configurations
+(a set of fraud-detection rules) and how you can utilize Flink's
+Broadcast mechanism to distribute processing configuration at runtime
+among the relevant operators.
+
+Following up directly where we left the discussion of the end-to-end
+solution last time, in this article we will describe how you can use the
+"Swiss knife" of Flink - the [*Process Function*](https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/stream/operators/process_function.html) - to create an
+implementation that is tailor-made to match your streaming business
+logic requirements. Our discussion will continue in the context of the
+[Fraud Detection engine]({{ site.baseurl }}/news/2020/01/15/demo-fraud-detection.html#fraud-detection-demo). We will also demonstrate how you can
+implement your own **custom replacement for time windows** for cases
+where the out-of-the-box windowing available from the DataStream API
+does not satisfy your requirements. In particular, we will look at the
+trade-offs that you can make when designing a solution which requires
+low-latency reactions to individual events.
+
+This article will describe some high-level concepts that can be applied
+independently, but it is recommended that you review the material in
+[part one]({{ site.baseurl }}/news/2020/01/15/demo-fraud-detection.html) and
+[part two]({{ site.baseurl }}/news/2020/03/24/demo-fraud-detection-2.html) of
+the series, as well as check out the [code base](https://github.com/afedulov/fraud-detection-demo), in order to make
+it easier to follow along.
+
+## ProcessFunction as a "Window"
+
+### Low Latency
+
+Let's start with a reminder of the type of fraud detection rule that we
+would like to support:
+
+*"Whenever the **sum** of **payments** from the same **payer** to the
+same **beneficiary** within **a 24 hour period** is **greater** than
+**200 000 $** - trigger an alert."*
+
+In other words, given a stream of transactions partitioned by a key that
+combines the payer and the beneficiary fields, we would like to look
+back in time and determine, for each incoming transaction, if the sum of
+all previous payments between the two specific participants exceeds the
+defined threshold. In effect, the computation window is always moved
+along to the position of the last observed event for a particular data
+partitioning key.
+
+
+Figure 1: Time Windows
+
+
+One of the common key requirements for a fraud detection system is *low
+response 

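The windowed-sum check at the heart of the rule quoted in this post can be sketched in plain Java, independent of Flink. All names below are illustrative, not taken from the demo code base; in the real engine this history would live in keyed Flink state rather than a plain list:

```java
import java.util.List;

// Minimal sketch of the rule's core check: given the payment history for one
// payer/beneficiary key, sum everything inside the 24-hour window ending at
// the newest event and compare against the threshold.
class TrailingWindowCheck {

    // One payment event for a single partitioning key.
    record Payment(long timestampMillis, double amount) {}

    static final long WINDOW_MILLIS = 24 * 60 * 60 * 1000L;

    // True if the payments within the 24h window ending at 'latest'
    // sum to more than 'threshold'.
    static boolean exceedsThreshold(List<Payment> history, long latest, double threshold) {
        double sum = 0.0;
        for (Payment p : history) {
            if (p.timestampMillis > latest - WINDOW_MILLIS && p.timestampMillis <= latest) {
                sum += p.amount;
            }
        }
        return sum > threshold;
    }
}
```

The window end is always the latest observed event, matching the description above of a computation window that moves along with each incoming transaction.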
[flink] 03/04: [hotfix] Use List instead of ArrayList in TypeExtractor

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit d05960c1ecefe622e297cd41a38d68ada3016f28
Author: Dawid Wysakowicz 
AuthorDate: Wed Jul 29 14:21:06 2020 +0200

[hotfix] Use List instead of ArrayList in TypeExtractor
---
 .../flink/api/java/typeutils/TypeExtractor.java| 50 +++---
 .../flink/formats/avro/typeutils/AvroTypeInfo.java |  4 +-
 2 files changed, 27 insertions(+), 27 deletions(-)
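The pattern applied throughout this hotfix - declaring variables and parameters by the `List` interface instead of the concrete `ArrayList` - can be sketched in isolation (class and method names here are illustrative, not from the Flink code base):

```java
import java.util.ArrayList;
import java.util.List;

// "Program to the interface": the methods only need List operations, so they
// accept any implementation; only the construction site names a concrete type.
class ListVsArrayList {

    // Works identically for ArrayList, LinkedList, List.of(...), etc.
    static String joinHierarchy(List<String> typeHierarchy) {
        StringBuilder sb = new StringBuilder();
        for (String t : typeHierarchy) {
            if (sb.length() > 0) sb.append(" -> ");
            sb.append(t);
        }
        return sb.toString();
    }

    static List<String> newHierarchy(String root) {
        List<String> h = new ArrayList<>(); // concrete type chosen once, here
        h.add(root);
        return h;
    }
}
```

The behavior is unchanged by such a refactoring; it only loosens the coupling between callers and the backing collection, which is exactly why the hunks below touch signatures but no logic.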

diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
index a990f76..318f10d 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
@@ -749,7 +749,7 @@ public class TypeExtractor {
// --- private methods 

 
private TypeInformation privateCreateTypeInfo(Type t) {
-   ArrayList typeHierarchy = new ArrayList<>();
+   List typeHierarchy = new ArrayList<>();
typeHierarchy.add(t);
return createTypeInfoWithTypeHierarchy(typeHierarchy, t, null, 
null);
}
@@ -758,7 +758,7 @@ public class TypeExtractor {
@SuppressWarnings("unchecked")
private  TypeInformation 
privateCreateTypeInfo(Class baseClass, Class clazz, int returnParamPos,
TypeInformation in1Type, TypeInformation 
in2Type) {
-   ArrayList typeHierarchy = new ArrayList<>();
+   List typeHierarchy = new ArrayList<>();
Type returnType = getParameterType(baseClass, typeHierarchy, 
clazz, returnParamPos);
 
TypeInformation typeInfo;
@@ -778,14 +778,14 @@ public class TypeExtractor {
 
// for LambdaFunctions
private  TypeInformation privateCreateTypeInfo(Type 
returnType, TypeInformation in1Type, TypeInformation in2Type) {
-   ArrayList typeHierarchy = new ArrayList<>();
+   List typeHierarchy = new ArrayList<>();
 
// get info from hierarchy
return createTypeInfoWithTypeHierarchy(typeHierarchy, 
returnType, in1Type, in2Type);
}
 
@SuppressWarnings({ "unchecked", "rawtypes" })
-   private  TypeInformation 
createTypeInfoWithTypeHierarchy(ArrayList typeHierarchy, Type t,
+   private  TypeInformation 
createTypeInfoWithTypeHierarchy(List typeHierarchy, Type t,
TypeInformation in1Type, TypeInformation 
in2Type) {
 
// check if type information can be created using a type factory
@@ -902,7 +902,7 @@ public class TypeExtractor {
throw new InvalidTypesException("Type Information could not be 
created.");
}
 
-   private  TypeInformation 
createTypeInfoFromInputs(TypeVariable returnTypeVar, ArrayList 
returnTypeHierarchy,
+   private  TypeInformation 
createTypeInfoFromInputs(TypeVariable returnTypeVar, List 
returnTypeHierarchy,
TypeInformation in1TypeInfo, TypeInformation 
in2TypeInfo) {
 
Type matReturnTypeVar = 
materializeTypeVariable(returnTypeHierarchy, returnTypeVar);
@@ -921,7 +921,7 @@ public class TypeExtractor {
}
 
// create a new type hierarchy for the input
-   ArrayList inputTypeHierarchy = new ArrayList<>();
+   List inputTypeHierarchy = new ArrayList<>();
// copy the function part of the type hierarchy
for (Type t : returnTypeHierarchy) {
Class clazz = typeToClass(t);
@@ -965,11 +965,11 @@ public class TypeExtractor {
 * Return the type information for "returnTypeVar" given that "inType" 
has type information "inTypeInfo".
 * Thus "inType" must contain "returnTypeVar" in a 
"inputTypeHierarchy", otherwise null is returned.
 */
-   private  TypeInformation 
createTypeInfoFromInput(TypeVariable returnTypeVar, ArrayList 
inputTypeHierarchy, Type inType, TypeInformation inTypeInfo) {
+   private  TypeInformation 
createTypeInfoFromInput(TypeVariable returnTypeVar, List 
inputTypeHierarchy, Type inType, TypeInformation inTypeInfo) {
TypeInformation info = null;
 
// use a factory to find corresponding type information to type 
variable
-   final ArrayList factoryHierarchy = new 
ArrayList<>(inputTypeHierarchy);
+   final List factoryHierarchy = new 
ArrayList<>(inputTypeHierarchy);
final TypeInfoFactory factory = 
getClosestFactory(factoryHierarchy, inType);
if (factory != null) {
// the type that defines the factory is last in factory 
hierarchy
@@ -1055,7 +1055,7 @@ 

[flink] 02/04: [hotfix] Remove warnings in TypeExtractor, AvroTypeInfo

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 3309d14b1abec8f9303f5158a57eee17c4de92d2
Author: Dawid Wysakowicz 
AuthorDate: Wed Jul 29 14:06:52 2020 +0200

[hotfix] Remove warnings in TypeExtractor, AvroTypeInfo
---
 .../api/java/typeutils/TypeExtractionUtils.java|   7 +-
 .../flink/api/java/typeutils/TypeExtractor.java| 118 ++---
 .../flink/formats/avro/typeutils/AvroTypeInfo.java |  10 +-
 3 files changed, 63 insertions(+), 72 deletions(-)

diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
index eef309e..600ea8e 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
@@ -257,12 +257,13 @@ public class TypeExtractionUtils {
/**
 * Convert ParameterizedType or Class to a Class.
 */
-   public static Class typeToClass(Type t) {
+   @SuppressWarnings("unchecked")
+   public static  Class typeToClass(Type t) {
if (t instanceof Class) {
-   return (Class)t;
+   return (Class) t;
}
else if (t instanceof ParameterizedType) {
-   return ((Class) ((ParameterizedType) 
t).getRawType());
+   return ((Class) ((ParameterizedType) 
t).getRawType());
}
throw new IllegalArgumentException("Cannot convert type to 
class");
}
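A self-contained version of the generified helper in the hunk above can illustrate the effect of the change: the unchecked cast is confined to one annotated method, so callers receive a typed `Class<T>` instead of a raw `Class`. This is a sketch for illustration, not the exact Flink source:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

// Sketch of the generified typeToClass: Class and ParameterizedType inputs
// are both reduced to their raw Class, with the unchecked cast localized here.
class TypeToClassDemo {

    @SuppressWarnings("unchecked")
    static <T> Class<T> typeToClass(Type t) {
        if (t instanceof Class) {
            return (Class<T>) t;
        } else if (t instanceof ParameterizedType) {
            return (Class<T>) ((ParameterizedType) t).getRawType();
        }
        throw new IllegalArgumentException("Cannot convert type to class");
    }

    // A generic field whose reflective Type is a ParameterizedType (List<String>).
    List<String> sample;

    // Exercises the ParameterizedType branch; should yield List.class.
    static Class<?> rawTypeOfSampleField() {
        try {
            return typeToClass(TypeToClassDemo.class.getDeclaredField("sample").getGenericType());
        } catch (NoSuchFieldException e) {
            throw new AssertionError(e);
        }
    }
}
```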
diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
index b94dd5f..a990f76 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
@@ -146,7 +146,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) mapInterface,
+   mapInterface,
MapFunction.class,
0,
1,
@@ -167,7 +167,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) flatMapInterface,
+   flatMapInterface,
FlatMapFunction.class,
0,
1,
@@ -195,7 +195,7 @@ public class TypeExtractor {
public static  TypeInformation 
getFoldReturnTypes(FoldFunction foldInterface, TypeInformation 
inType, String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) foldInterface,
+   foldInterface,
FoldFunction.class,
0,
1,
@@ -251,7 +251,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) mapPartitionInterface,
+   mapPartitionInterface,
MapPartitionFunction.class,
0,
1,
@@ -271,7 +271,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) groupReduceInterface,
+   groupReduceInterface,
GroupReduceFunction.class,
0,
1,
@@ -291,7 +291,7 @@ public class TypeExtractor {

String 
functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) combineInterface,
+   combineInterface,
GroupCombineFunction.class,
0,
1,
@@ -313,7 +313,7 @@ public class TypeExtractor {
TypeInformation in1Type, TypeInformation 
in2Type, String functionName, boolean allowMissing)
{
return getBinaryOperatorReturnType(
-   (Function) joinInterface,
+   joinInterface,
FlatJoinFunction.class,

[flink] 01/04: [hotfix] Remove dead code in TypeExtractor

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 250a6c1ca911535efb5a03433e3f95daf8be3ff7
Author: Dawid Wysakowicz 
AuthorDate: Wed Jul 29 13:53:54 2020 +0200

[hotfix] Remove dead code in TypeExtractor
---
 .../flink/api/common/typeinfo/TypeInfoFactory.java |  4 +-
 .../flink/api/java/typeutils/TypeExtractor.java| 72 --
 2 files changed, 13 insertions(+), 63 deletions(-)

diff --git 
a/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfoFactory.java
 
b/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfoFactory.java
index 898b05e..54f9335 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfoFactory.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfoFactory.java
@@ -19,7 +19,6 @@
 package org.apache.flink.api.common.typeinfo;
 
 import org.apache.flink.annotation.Public;
-import org.apache.flink.api.java.typeutils.TypeExtractor;
 
 import java.lang.reflect.Type;
 import java.util.Map;
@@ -29,8 +28,7 @@ import java.util.Map;
  * plugging-in user-defined {@link TypeInformation} into the Flink type 
system. The factory is
  * called during the type extraction phase if the corresponding type has been 
annotated with
  * {@link TypeInfo}. In a hierarchy of types the closest factory will be 
chosen while traversing
- * upwards, however, a globally registered factory has highest precedence
- * (see {@link TypeExtractor#registerFactory(Type, Class)}).
+ * upwards.
  *
  * @param  type for which {@link TypeInformation} is created
  */
diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
index adcfdbb..b94dd5f 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
@@ -18,7 +18,6 @@
 
 package org.apache.flink.api.java.typeutils;
 
-import org.apache.commons.lang3.ClassUtils;
 import org.apache.flink.annotation.Internal;
 import org.apache.flink.annotation.Public;
 import org.apache.flink.annotation.PublicEvolving;
@@ -53,6 +52,8 @@ import org.apache.flink.types.Row;
 import org.apache.flink.types.Value;
 import org.apache.flink.util.InstantiationUtil;
 import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.lang3.ClassUtils;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
@@ -72,14 +73,14 @@ import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
 
-import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.getTypeHierarchy;
-import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.hasSuperclass;
-import static org.apache.flink.util.Preconditions.checkNotNull;
 import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.checkAndExtractLambda;
 import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.getAllDeclaredMethods;
+import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.getTypeHierarchy;
+import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.hasSuperclass;
 import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.isClassType;
 import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.sameTypeVars;
 import static 
org.apache.flink.api.java.typeutils.TypeExtractionUtils.typeToClass;
+import static org.apache.flink.util.Preconditions.checkNotNull;
 
 /**
  * A utility for reflection analysis on classes, to determine the return type 
of implementations of transformation
@@ -132,34 +133,6 @@ public class TypeExtractor {
}
 
// 

-   //  TypeInfoFactory registry
-   // 

-
-   private static Map> 
registeredTypeInfoFactories = new HashMap<>();
-
-   /**
-* Registers a type information factory globally for a certain type. 
Every following type extraction
-* operation will use the provided factory for this type. The factory 
will have highest precedence
-* for this type. In a hierarchy of types the registered factory has 
higher precedence than annotations
-* at the same level but lower precedence than factories defined down 
the hierarchy.
-*
-* @param t type for which a new factory is registered
-* @param factory type information factory that will produce {@link 
TypeInformation}
-*/
-   private static void registerFactory(Type t, Class factory) {
-   Preconditions.checkNotNull(t, "Type parameter must not be 
null.");
-   

[flink] branch master updated (1c09c23 -> a884706)

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 1c09c23  [FLINK-16048][avro] Support read/write confluent schema 
registry avro data from Kafka
 new 250a6c1  [hotfix] Remove dead code in TypeExtractor
 new 3309d14  [hotfix] Remove warnings in TypeExtractor, AvroTypeInfo
 new d05960c  [hotfix] Use List instead of ArrayList in TypeExtractor
 new a884706  [FLINK-12175] Change filling of typeHierarchy in analyzePojo, 
for correctly creating fields TypeInfo

The 4 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../flink/api/common/typeinfo/TypeInfoFactory.java |   4 +-
 .../api/java/typeutils/TypeExtractionUtils.java|   7 +-
 .../flink/api/java/typeutils/TypeExtractor.java| 255 -
 .../PojoParametrizedTypeExtractionTest.java| 100 
 .../flink/formats/avro/typeutils/AvroTypeInfo.java |  22 +-
 5 files changed, 213 insertions(+), 175 deletions(-)
 create mode 100644 
flink-core/src/test/java/org/apache/flink/api/java/typeutils/PojoParametrizedTypeExtractionTest.java



[flink] 04/04: [FLINK-12175] Change filling of typeHierarchy in analyzePojo, for correctly creating fields TypeInfo

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit a8847061c40bf8ca17e22e6e412a378f53b8b82d
Author: Andrei Bulgakov 
AuthorDate: Tue May 7 12:01:51 2019 +0300

[FLINK-12175] Change filling of typeHierarchy in analyzePojo, for correctly 
creating fields TypeInfo

Co-authored-by: Dawid Wysakowicz 
---
 .../flink/api/java/typeutils/TypeExtractor.java|  37 
 .../PojoParametrizedTypeExtractionTest.java| 100 +
 .../flink/formats/avro/typeutils/AvroTypeInfo.java |  12 +--
 3 files changed, 123 insertions(+), 26 deletions(-)

diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
index 318f10d..4108ee2 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
@@ -805,8 +805,9 @@ public class TypeExtractor {
 
// go up the hierarchy until we reach immediate child 
of Tuple (with or without generics)
// collect the types while moving up for a later 
top-down
+			List<Type> typeHierarchyForSubtypes = new ArrayList<>(typeHierarchy);
while (!(isClassType(curT) && 
typeToClass(curT).getSuperclass().equals(Tuple.class))) {
-   typeHierarchy.add(curT);
+   typeHierarchyForSubtypes.add(curT);
curT = typeToClass(curT).getGenericSuperclass();
}
 
@@ -819,24 +820,19 @@ public class TypeExtractor {
throw new InvalidTypesException("Tuple needs to 
be parameterized by using generics.");
}
 
-   typeHierarchy.add(curT);
+   typeHierarchyForSubtypes.add(curT);
 
// create the type information for the subtypes
			final TypeInformation<?>[] subTypesInfo = createSubTypesInfo(
t,
(ParameterizedType) curT,
-   typeHierarchy,
+   typeHierarchyForSubtypes,
in1Type,
in2Type,
false);
// type needs to be treated a pojo due to additional 
fields
if (subTypesInfo == null) {
-   if (t instanceof ParameterizedType) {
-   return analyzePojo(typeToClass(t), new 
ArrayList<>(typeHierarchy), (ParameterizedType) t, in1Type, in2Type);
-   }
-   else {
-   return analyzePojo(typeToClass(t), new 
ArrayList<>(typeHierarchy), null, in1Type, in2Type);
-   }
+   return analyzePojo(t, new 
ArrayList<>(typeHierarchy), in1Type, in2Type);
}
// return tuple info
return new TupleTypeInfo(typeToClass(t), subTypesInfo);
@@ -1692,7 +1688,8 @@ public class TypeExtractor {
}
 
try {
-			TypeInformation<OUT> pojoType = analyzePojo(clazz, new ArrayList<>(typeHierarchy), parameterizedType, in1Type, in2Type);
+			Type t = parameterizedType != null ? parameterizedType : clazz;
+			TypeInformation<OUT> pojoType = analyzePojo(t, new ArrayList<>(typeHierarchy), in1Type, in2Type);
if (pojoType != null) {
return pojoType;
}
@@ -1772,9 +1769,13 @@ public class TypeExtractor {
}
 
@SuppressWarnings("unchecked")
-	protected <OUT, IN1, IN2> TypeInformation<OUT> analyzePojo(Class<OUT> clazz, List<Type> typeHierarchy,
-			ParameterizedType parameterizedType, TypeInformation<IN1> in1Type, TypeInformation<IN2> in2Type) {
+	protected <OUT, IN1, IN2> TypeInformation<OUT> analyzePojo(
+			Type type,
+			List<Type> typeHierarchy,
+			TypeInformation<IN1> in1Type,
+			TypeInformation<IN2> in2Type) {
 
+		Class<OUT> clazz = typeToClass(type);
if (!Modifier.isPublic(clazz.getModifiers())) {
LOG.info("Class " + clazz.getName() + " is not public 
so it cannot be used as a POJO type " +
"and must be processed as GenericType. Please 
read the Flink documentation " +
@@ -1782,14 +1783,8 @@ public class TypeExtractor {
return new GenericTypeInfo<>(clazz);
}
 
-   
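The core of the FLINK-12175 patch above is that each recursion branch now works on its own copy (`new ArrayList<>(typeHierarchy)`) instead of appending to one shared list. A minimal, hypothetical sketch (class and type names invented for illustration, not Flink code) of why a shared mutable hierarchy list lets one field's types leak into a sibling field's analysis:

```java
import java.util.ArrayList;
import java.util.List;

public class TypeHierarchyCopyDemo {
    // Simplified stand-in for one recursive analysis branch: it appends the
    // types it walks through to the hierarchy list it was given.
    static List<String> analyzeBranch(List<String> hierarchy, String fieldType) {
        hierarchy.add(fieldType);
        return hierarchy;
    }

    public static void main(String[] args) {
        List<String> typeHierarchy = new ArrayList<>(List.of("MyPojo"));

        // Buggy pattern: sibling branches share one mutable list, so field B's
        // analysis would start from a hierarchy polluted by field A.
        analyzeBranch(typeHierarchy, "FieldAType");
        System.out.println(typeHierarchy);   // [MyPojo, FieldAType]

        // Patched pattern: copy before descending, as the commit does with
        // new ArrayList<>(typeHierarchy) - the root stays clean for field B.
        List<String> freshRoot = new ArrayList<>(List.of("MyPojo"));
        List<String> branchB = analyzeBranch(new ArrayList<>(freshRoot), "FieldBType");
        System.out.println(freshRoot);       // [MyPojo]
        System.out.println(branchB);         // [MyPojo, FieldBType]
    }
}
```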

[flink] 02/04: [hotfix] Remove warnings in TypeExtractor, AvroTypeInfo

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 3309d14b1abec8f9303f5158a57eee17c4de92d2
Author: Dawid Wysakowicz 
AuthorDate: Wed Jul 29 14:06:52 2020 +0200

[hotfix] Remove warnings in TypeExtractor, AvroTypeInfo
---
 .../api/java/typeutils/TypeExtractionUtils.java|   7 +-
 .../flink/api/java/typeutils/TypeExtractor.java| 118 ++---
 .../flink/formats/avro/typeutils/AvroTypeInfo.java |  10 +-
 3 files changed, 63 insertions(+), 72 deletions(-)

diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
index eef309e..600ea8e 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractionUtils.java
@@ -257,12 +257,13 @@ public class TypeExtractionUtils {
/**
 * Convert ParameterizedType or Class to a Class.
 */
-	public static Class<?> typeToClass(Type t) {
+	@SuppressWarnings("unchecked")
+	public static <T> Class<T> typeToClass(Type t) {
 		if (t instanceof Class) {
-			return (Class<?>)t;
+			return (Class<T>) t;
 		}
 		else if (t instanceof ParameterizedType) {
-			return ((Class<?>) ((ParameterizedType) t).getRawType());
+			return ((Class<T>) ((ParameterizedType) t).getRawType());
}
throw new IllegalArgumentException("Cannot convert type to 
class");
}
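The hotfix makes `typeToClass` generic so the raw `(Class)` casts at call sites can disappear, trading many unchecked warnings for one localized `@SuppressWarnings`. A self-contained sketch of the new shape (the demo class and its `Holder` field are hypothetical, not Flink code):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

public class TypeToClassDemo {
    // Mirrors the patched utility: the unchecked cast lives in one place.
    @SuppressWarnings("unchecked")
    public static <T> Class<T> typeToClass(Type t) {
        if (t instanceof Class) {
            return (Class<T>) t;
        } else if (t instanceof ParameterizedType) {
            // For List<String> this yields the raw List.class
            return (Class<T>) ((ParameterizedType) t).getRawType();
        }
        throw new IllegalArgumentException("Cannot convert type to class");
    }

    // Hypothetical holder so reflection hands us a ParameterizedType.
    static class Holder { List<String> names; }

    public static void main(String[] args) throws NoSuchFieldException {
        Type listOfString = Holder.class.getDeclaredField("names").getGenericType();
        // Caller-side: the target type drives inference, no explicit cast needed.
        Class<List<String>> listClass = typeToClass(listOfString);
        System.out.println(listClass);   // interface java.util.List

        Class<?> stringClass = typeToClass(String.class);
        System.out.println(stringClass == String.class);   // true
    }
}
```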
diff --git 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
index b94dd5f..a990f76 100644
--- 
a/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
+++ 
b/flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
@@ -146,7 +146,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) mapInterface,
+   mapInterface,
MapFunction.class,
0,
1,
@@ -167,7 +167,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) flatMapInterface,
+   flatMapInterface,
FlatMapFunction.class,
0,
1,
@@ -195,7 +195,7 @@ public class TypeExtractor {
	public static <IN, OUT> TypeInformation<OUT> getFoldReturnTypes(FoldFunction<IN, OUT> foldInterface, TypeInformation<IN> inType, String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) foldInterface,
+   foldInterface,
FoldFunction.class,
0,
1,
@@ -251,7 +251,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) mapPartitionInterface,
+   mapPartitionInterface,
MapPartitionFunction.class,
0,
1,
@@ -271,7 +271,7 @@ public class TypeExtractor {
String functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) groupReduceInterface,
+   groupReduceInterface,
GroupReduceFunction.class,
0,
1,
@@ -291,7 +291,7 @@ public class TypeExtractor {

String 
functionName, boolean allowMissing)
{
return getUnaryOperatorReturnType(
-   (Function) combineInterface,
+   combineInterface,
GroupCombineFunction.class,
0,
1,
@@ -313,7 +313,7 @@ public class TypeExtractor {
			TypeInformation<IN1> in1Type, TypeInformation<IN2> in2Type, String functionName, boolean allowMissing)
{
return getBinaryOperatorReturnType(
-   (Function) joinInterface,
+   joinInterface,
FlatJoinFunction.class,

[flink-docker] 01/02: [FLINK-16260] Add release.metadata files + Java 11 release for 1.11.1

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git

commit 8aafb8413a9675ebbe74fe3e5d22141f26922977
Author: Robert Metzger 
AuthorDate: Thu Jul 30 14:45:01 2020 +0200

[FLINK-16260] Add release.metadata files + Java 11 release for 1.11.1

This closes #35
---
 1.10/scala_2.11-debian/release.metadata | 2 ++
 1.10/scala_2.12-debian/release.metadata | 2 ++
 1.11/{scala_2.11-debian => scala_2.11-java11-debian}/Dockerfile | 2 +-
 .../docker-entrypoint.sh| 0
 1.11/scala_2.11-java11-debian/release.metadata  | 2 ++
 1.11/{scala_2.11-debian => scala_2.11-java8-debian}/Dockerfile  | 0
 .../{scala_2.11-debian => scala_2.11-java8-debian}/docker-entrypoint.sh | 0
 1.11/scala_2.11-java8-debian/release.metadata   | 2 ++
 1.11/{scala_2.12-debian => scala_2.12-java11-debian}/Dockerfile | 2 +-
 .../docker-entrypoint.sh| 0
 1.11/scala_2.12-java11-debian/release.metadata  | 2 ++
 1.11/{scala_2.12-debian => scala_2.12-java8-debian}/Dockerfile  | 0
 .../{scala_2.12-debian => scala_2.12-java8-debian}/docker-entrypoint.sh | 0
 1.11/scala_2.12-java8-debian/release.metadata   | 2 ++
 14 files changed, 14 insertions(+), 2 deletions(-)

diff --git a/1.10/scala_2.11-debian/release.metadata 
b/1.10/scala_2.11-debian/release.metadata
new file mode 100644
index 000..4825226
--- /dev/null
+++ b/1.10/scala_2.11-debian/release.metadata
@@ -0,0 +1,2 @@
+Tags: 1.10.1-scala_2.11, 1.10-scala_2.11, scala_2.11
+Architectures: amd64
diff --git a/1.10/scala_2.12-debian/release.metadata 
b/1.10/scala_2.12-debian/release.metadata
new file mode 100644
index 000..e85e52c
--- /dev/null
+++ b/1.10/scala_2.12-debian/release.metadata
@@ -0,0 +1,2 @@
+Tags: 1.10.1-scala_2.12, 1.10-scala_2.12, scala_2.12, 1.10.1, 1.10, latest
+Architectures: amd64
diff --git a/1.11/scala_2.11-debian/Dockerfile 
b/1.11/scala_2.11-java11-debian/Dockerfile
similarity index 99%
copy from 1.11/scala_2.11-debian/Dockerfile
copy to 1.11/scala_2.11-java11-debian/Dockerfile
index 4f55fcd..952af11 100644
--- a/1.11/scala_2.11-debian/Dockerfile
+++ b/1.11/scala_2.11-java11-debian/Dockerfile
@@ -16,7 +16,7 @@
 # limitations under the License.
 ###
 
-FROM openjdk:8-jre
+FROM openjdk:11-jre
 
 # Install dependencies
 RUN set -ex; \
diff --git a/1.11/scala_2.12-debian/docker-entrypoint.sh 
b/1.11/scala_2.11-java11-debian/docker-entrypoint.sh
similarity index 100%
copy from 1.11/scala_2.12-debian/docker-entrypoint.sh
copy to 1.11/scala_2.11-java11-debian/docker-entrypoint.sh
diff --git a/1.11/scala_2.11-java11-debian/release.metadata 
b/1.11/scala_2.11-java11-debian/release.metadata
new file mode 100644
index 000..db7eaa1
--- /dev/null
+++ b/1.11/scala_2.11-java11-debian/release.metadata
@@ -0,0 +1,2 @@
+Tags: 1.11.1-scala_2.11-java11, 1.11-scala_2.11-java11, scala_2.11-java11
+Architectures: amd64
diff --git a/1.11/scala_2.11-debian/Dockerfile 
b/1.11/scala_2.11-java8-debian/Dockerfile
similarity index 100%
rename from 1.11/scala_2.11-debian/Dockerfile
rename to 1.11/scala_2.11-java8-debian/Dockerfile
diff --git a/1.11/scala_2.11-debian/docker-entrypoint.sh 
b/1.11/scala_2.11-java8-debian/docker-entrypoint.sh
similarity index 100%
rename from 1.11/scala_2.11-debian/docker-entrypoint.sh
rename to 1.11/scala_2.11-java8-debian/docker-entrypoint.sh
diff --git a/1.11/scala_2.11-java8-debian/release.metadata 
b/1.11/scala_2.11-java8-debian/release.metadata
new file mode 100644
index 000..d91cde5
--- /dev/null
+++ b/1.11/scala_2.11-java8-debian/release.metadata
@@ -0,0 +1,2 @@
+Tags: 1.11.1-scala_2.11-java8, 1.11-scala_2.11-java8, scala_2.11-java8, 
1.11.1-scala_2.11, 1.11-scala_2.11, scala_2.11
+Architectures: amd64
diff --git a/1.11/scala_2.12-debian/Dockerfile 
b/1.11/scala_2.12-java11-debian/Dockerfile
similarity index 99%
copy from 1.11/scala_2.12-debian/Dockerfile
copy to 1.11/scala_2.12-java11-debian/Dockerfile
index 2032bad..5fd5b4a 100644
--- a/1.11/scala_2.12-debian/Dockerfile
+++ b/1.11/scala_2.12-java11-debian/Dockerfile
@@ -16,7 +16,7 @@
 # limitations under the License.
 ###
 
-FROM openjdk:8-jre
+FROM openjdk:11-jre
 
 # Install dependencies
 RUN set -ex; \
diff --git a/1.11/scala_2.12-debian/docker-entrypoint.sh 
b/1.11/scala_2.12-java11-debian/docker-entrypoint.sh
similarity index 100%
copy from 1.11/scala_2.12-debian/docker-entrypoint.sh
copy to 1.11/scala_2.12-java11-debian/docker-entrypoint.sh
diff --git a/1.11/scala_2.12-java11-debian/release.metadata 

[flink-docker] 02/02: [hotfix] Fix regex to remove scala_x.xx tags properly

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git

commit d82ab62895fddbeb10c963ef95f0860b2b515ddc
Author: Robert Metzger 
AuthorDate: Thu Jul 30 15:31:41 2020 +0200

[hotfix] Fix regex to remove scala_x.xx tags properly
---
 generate-stackbrew-library.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/generate-stackbrew-library.sh b/generate-stackbrew-library.sh
index 2a7307a..98ffe13 100755
--- a/generate-stackbrew-library.sh
+++ b/generate-stackbrew-library.sh
@@ -47,7 +47,7 @@ pruneTags() {
 else
 # remove "latest" and any "scala_" tag, unless it is the latest version
 # the "scala_" tag has a similar semantic as the "latest" tag in 
docker registries. 
-echo $tags | sed -E 's|, (scala\|latest)[-_[:alnum:]]*||g'
+echo $tags | sed -E 's|, (scala\|latest)[-_.[:alnum:]]*||g'
 fi
 }
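The fix above is a single added `.` in the bracket expression; without it, dotted tags such as `scala_2.12` are only partially consumed by the match and leave residue behind. A standalone sketch with a hypothetical tag list:

```shell
#!/bin/bash
tags="1.11.1-scala_2.12, 1.11-scala_2.12, scala_2.12, latest"

# Old pattern: '.' is not in the class, so ", scala_2" is removed but ".12" leaks through.
echo "$tags" | sed -E 's|, (scala\|latest)[-_[:alnum:]]*||g'
# -> 1.11.1-scala_2.12, 1.11-scala_2.12.12

# Fixed pattern from the commit: the dot keeps the whole dotted tag inside the match.
echo "$tags" | sed -E 's|, (scala\|latest)[-_.[:alnum:]]*||g'
# -> 1.11.1-scala_2.12, 1.11-scala_2.12
```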
 



[flink-docker] branch master updated (e47a802 -> d82ab62)

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git.


from e47a802  [FLINK-16260] Change generate-stackbrew-library.sh to support 
new release.metadata file
 new 8aafb84  [FLINK-16260] Add release.metadata files + Java 11 release 
for 1.11.1
 new d82ab62  [hotfix] Fix regex to remove scala_x.xx tags properly

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 1.10/scala_2.11-debian/release.metadata | 2 ++
 1.10/scala_2.12-debian/release.metadata | 2 ++
 1.11/{scala_2.11-debian => scala_2.11-java11-debian}/Dockerfile | 2 +-
 .../docker-entrypoint.sh| 0
 1.11/scala_2.11-java11-debian/release.metadata  | 2 ++
 1.11/{scala_2.11-debian => scala_2.11-java8-debian}/Dockerfile  | 0
 .../{scala_2.11-debian => scala_2.11-java8-debian}/docker-entrypoint.sh | 0
 1.11/scala_2.11-java8-debian/release.metadata   | 2 ++
 1.11/{scala_2.12-debian => scala_2.12-java11-debian}/Dockerfile | 2 +-
 .../docker-entrypoint.sh| 0
 1.11/scala_2.12-java11-debian/release.metadata  | 2 ++
 1.11/{scala_2.12-debian => scala_2.12-java8-debian}/Dockerfile  | 0
 .../{scala_2.12-debian => scala_2.12-java8-debian}/docker-entrypoint.sh | 0
 1.11/scala_2.12-java8-debian/release.metadata   | 2 ++
 generate-stackbrew-library.sh   | 2 +-
 15 files changed, 15 insertions(+), 3 deletions(-)
 create mode 100644 1.10/scala_2.11-debian/release.metadata
 create mode 100644 1.10/scala_2.12-debian/release.metadata
 copy 1.11/{scala_2.11-debian => scala_2.11-java11-debian}/Dockerfile (99%)
 copy 1.11/{scala_2.12-debian => scala_2.11-java11-debian}/docker-entrypoint.sh 
(100%)
 create mode 100644 1.11/scala_2.11-java11-debian/release.metadata
 rename 1.11/{scala_2.11-debian => scala_2.11-java8-debian}/Dockerfile (100%)
 rename 1.11/{scala_2.11-debian => 
scala_2.11-java8-debian}/docker-entrypoint.sh (100%)
 create mode 100644 1.11/scala_2.11-java8-debian/release.metadata
 copy 1.11/{scala_2.12-debian => scala_2.12-java11-debian}/Dockerfile (99%)
 copy 1.11/{scala_2.12-debian => scala_2.12-java11-debian}/docker-entrypoint.sh 
(100%)
 create mode 100644 1.11/scala_2.12-java11-debian/release.metadata
 rename 1.11/{scala_2.12-debian => scala_2.12-java8-debian}/Dockerfile (100%)
 rename 1.11/{scala_2.12-debian => 
scala_2.12-java8-debian}/docker-entrypoint.sh (100%)
 create mode 100644 1.11/scala_2.12-java8-debian/release.metadata



[flink-docker] branch dev-1.10 updated: [FLINK-16260] Generate release.metadata file (#34)

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a commit to branch dev-1.10
in repository https://gitbox.apache.org/repos/asf/flink-docker.git


The following commit(s) were added to refs/heads/dev-1.10 by this push:
 new f9c997a  [FLINK-16260] Generate release.metadata file (#34)
f9c997a is described below

commit f9c997ade302883d3347e08b1c6a5f33e23169dd
Author: Robert Metzger <89049+rmetz...@users.noreply.github.com>
AuthorDate: Thu Jul 30 15:39:10 2020 +0200

[FLINK-16260] Generate release.metadata file (#34)
---
 Dockerfile-debian.template  |  2 +-
 add-custom.sh   |  6 ++---
 add-version.sh  |  7 +++--
 common.sh   | 26 ---
 generator.sh| 62 +
 testing/run_travis_tests.sh | 12 -
 6 files changed, 76 insertions(+), 39 deletions(-)

diff --git a/Dockerfile-debian.template b/Dockerfile-debian.template
index 7222314..58607d4 100644
--- a/Dockerfile-debian.template
+++ b/Dockerfile-debian.template
@@ -16,7 +16,7 @@
 # limitations under the License.
 ###
 
-FROM openjdk:8-jre
+FROM %%FROM_IMAGE%%
 
 # Install dependencies
 RUN set -ex; \
diff --git a/add-custom.sh b/add-custom.sh
index c863f3f..12c0d5c 100755
--- a/add-custom.sh
+++ b/add-custom.sh
@@ -4,7 +4,7 @@
 # Flink distribution.
 # This is exlusively for development purposes.
 
-source "$(dirname "$0")"/common.sh
+source "$(dirname "$0")"/generator.sh
 
 function usage() {
 echo >&2 "usage: $0 -u binary-download-url [-n name]"
@@ -43,7 +43,7 @@ echo -n >&2 "Generating Dockerfiles..."
 for source_variant in "${SOURCE_VARIANTS[@]}"; do
   dir="dev/${name}-${source_variant}"
   rm -rf "${dir}"
-
-  generate "${dir}" "${binary_download_url}" "" "" false ${source_variant}
+  mkdir "$dir"
+  generateDockerfile "${dir}" "${binary_download_url}" "" "" false 
${source_variant}
 done
 echo >&2 " done."
diff --git a/add-version.sh b/add-version.sh
index 1cc87ee..c231a05 100755
--- a/add-version.sh
+++ b/add-version.sh
@@ -10,7 +10,6 @@
 #
 # See other repos (e.g. httpd, cassandra) for update.sh examples.
 
-source "$(dirname "$0")"/common.sh
 
 function usage() {
 echo >&2 "usage: $0 -r flink-release -f flink-version"
@@ -98,6 +97,8 @@ fi
 
 mkdir "$flink_release"
 
+source "$(dirname "$0")"/generator.sh
+
 echo -n >&2 "Generating Dockerfiles..."
 for source_variant in "${SOURCE_VARIANTS[@]}"; do
 for scala_version in "${scala_versions[@]}"; do
@@ -109,7 +110,9 @@ for source_variant in "${SOURCE_VARIANTS[@]}"; do
 # Not all mirrors have the .asc files
 flink_asc_url=https://www.apache.org/dist/${flink_url_file_path}.asc
 
-generate "${dir}" "${flink_tgz_url}" "${flink_asc_url}" ${gpg_key} 
true ${source_variant}
+mkdir "$dir"
+generateDockerfile "${dir}" "${flink_tgz_url}" "${flink_asc_url}" 
${gpg_key} true ${source_variant}
+generateReleaseMetadata "${dir}" ${flink_release} ${flink_version} 
${scala_version} ${source_variant}
 done
 done
 echo >&2 " done."
diff --git a/common.sh b/common.sh
deleted file mode 100644
index 0232930..000
--- a/common.sh
+++ /dev/null
@@ -1,26 +0,0 @@
-#!/bin/bash -e
-
-# Defaults, can vary between versions
-export SOURCE_VARIANTS=( debian )
-
-function generate() {
-dir=$1
-binary_download_url=$2
-asc_download_url=$3
-gpg_key=$4
-check_gpg=$5
-source_variant=$6
-
-mkdir "$dir"
-cp docker-entrypoint.sh "$dir/docker-entrypoint.sh"
-
-# '&' has special semantics in sed replacement patterns
-escaped_binary_download_url=$(echo "$binary_download_url" | sed 
's/&/\\\&/')
-
-sed \
--e "s,%%BINARY_DOWNLOAD_URL%%,${escaped_binary_download_url}," \
--e "s,%%ASC_DOWNLOAD_URL%%,$asc_download_url," \
--e "s/%%GPG_KEY%%/$gpg_key/" \
--e "s/%%CHECK_GPG%%/${check_gpg}/" \
-"Dockerfile-$source_variant.template" > "$dir/Dockerfile"
-}
diff --git a/generator.sh b/generator.sh
new file mode 100644
index 000..7393af9
--- /dev/null
+++ b/generator.sh
@@ -0,0 +1,62 @@
+#!/bin/bash -e
+
+export SOURCE_VARIANTS=(debian )
+
+export DEFAULT_SCALA="2.12"
+
+function generateDockerfile {
+# define variables
+dir=$1
+binary_download_url=$2
+asc_download_url=$3
+gpg_key=$4
+check_gpg=$5
+source_variant=$6
+
+from_docker_image="openjdk:8-jre"
+
+cp docker-entrypoint.sh "$dir/docker-entrypoint.sh"
+
+# '&' has special semantics in sed replacement patterns
+escaped_binary_download_url=$(echo "$binary_download_url" | sed 
's/&/\\\&/')
+
+# generate Dockerfile
+sed \
+-e "s,%%BINARY_DOWNLOAD_URL%%,${escaped_binary_download_url}," \
+-e "s,%%ASC_DOWNLOAD_URL%%,$asc_download_url," \
+-e "s/%%GPG_KEY%%/$gpg_key/" \
+-e "s/%%CHECK_GPG%%/${check_gpg}/" \
+-e 
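Both the removed `generate` and the new `generateDockerfile` pre-escape `&` before substituting the URL, because in a sed replacement a bare `&` expands to the entire matched pattern. A standalone sketch with a hypothetical download URL:

```shell
#!/bin/bash
binary_download_url='https://example.org/flink.tgz?mirror=a&arch=amd64'

# Same escaping as in the script: each '&' becomes '\&' so sed treats it literally.
escaped_binary_download_url=$(echo "$binary_download_url" | sed 's/&/\\\&/')

# Unescaped: '&' re-inserts the matched placeholder text.
echo 'URL: %%BINARY_DOWNLOAD_URL%%' | sed "s,%%BINARY_DOWNLOAD_URL%%,${binary_download_url},"
# -> URL: https://example.org/flink.tgz?mirror=a%%BINARY_DOWNLOAD_URL%%arch=amd64

# Escaped: the literal URL survives intact.
echo 'URL: %%BINARY_DOWNLOAD_URL%%' | sed "s,%%BINARY_DOWNLOAD_URL%%,${escaped_binary_download_url},"
# -> URL: https://example.org/flink.tgz?mirror=a&arch=amd64
```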

[flink-docker] branch dev-1.11 updated (4271527 -> fb4e178)

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a change to branch dev-1.11
in repository https://gitbox.apache.org/repos/asf/flink-docker.git.


from 4271527  [FLINK-18664] Add GPG key for 1.11.1 release
 add c0a719b  [FLINK-16260] Add support for generating Java 11 dockerfiles
 add fb4e178  [FLINK-16260] Generate release.metadata file

No new revisions were added by this update.

Summary of changes:
 Dockerfile-debian.template  |  2 +-
 add-custom.sh   | 14 ---
 add-version.sh  | 20 ++
 common.sh   | 26 -
 generator.sh| 90 +
 testing/run_travis_tests.sh | 16 
 6 files changed, 120 insertions(+), 48 deletions(-)
 delete mode 100644 common.sh
 create mode 100644 generator.sh



[flink] branch master updated (a0227e2 -> 1c09c23)

2020-07-30 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from a0227e2  [FLINK-18362][yarn] Fix mistakenly merged commit 0e10fd5b8ee0
 add 1c09c23  [FLINK-16048][avro] Support read/write confluent schema 
registry avro data from Kafka

No new revisions were added by this update.

Summary of changes:
 .../flink-avro-confluent-registry/pom.xml  |  56 +++-
 .../confluent/CachedSchemaCoderProvider.java   |  76 ++
 ...ConfluentRegistryAvroDeserializationSchema.java |  20 --
 .../ConfluentRegistryAvroSerializationSchema.java  |  22 --
 .../confluent/RegistryAvroFormatFactory.java}  |  86 +++---
 .../registry/confluent/RegistryAvroOptions.java|  30 +--
 .../src/main/resources/META-INF/NOTICE |   4 +-
 .../org.apache.flink.table.factories.Factory   |   2 +-
 .../confluent/RegistryAvroFormatFactoryTest.java}  | 107 +---
 .../RegistryAvroRowDataSeDeSchemaTest.java | 199 ++
 .../formats/avro/AvroDeserializationSchema.java|  19 ++
 .../formats/avro/AvroFileSystemFormatFactory.java  |   8 +-
 .../avro/AvroRowDataDeserializationSchema.java | 297 +++--
 .../avro/AvroRowDataSerializationSchema.java   | 232 ++--
 .../formats/avro/AvroSerializationSchema.java  |  19 ++
 .../formats/avro/AvroToRowDataConverters.java  | 248 +
 .../avro/RegistryAvroDeserializationSchema.java|  23 +-
 .../avro/RegistryAvroSerializationSchema.java  |  23 +-
 .../formats/avro/RowDataToAvroConverters.java  | 203 ++
 .../avro/typeutils/AvroSchemaConverter.java| 122 +
 .../avro/typeutils/AvroSchemaConverterTest.java|  49 
 21 files changed, 1217 insertions(+), 628 deletions(-)
 create mode 100644 
flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/CachedSchemaCoderProvider.java
 copy 
flink-formats/{flink-json/src/main/java/org/apache/flink/formats/json/JsonFormatFactory.java
 => 
flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactory.java}
 (62%)
 copy 
flink-kubernetes/src/main/java/org/apache/flink/kubernetes/configuration/KubernetesConfigOptionsInternal.java
 => 
flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroOptions.java
 (55%)
 copy {flink-connectors/flink-connector-kafka => 
flink-formats/flink-avro-confluent-registry}/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
 (91%)
 copy 
flink-formats/{flink-json/src/test/java/org/apache/flink/formats/json/canal/CanalJsonFormatFactoryTest.java
 => 
flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactoryTest.java}
 (54%)
 create mode 100644 
flink-formats/flink-avro-confluent-registry/src/test/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroRowDataSeDeSchemaTest.java
 create mode 100644 
flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/AvroToRowDataConverters.java
 create mode 100644 
flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/RowDataToAvroConverters.java



[flink] branch master updated: [FLINK-18362][yarn] Fix mistakenly merged commit 0e10fd5b8ee0

2020-07-30 Thread kkloudas
This is an automated email from the ASF dual-hosted git repository.

kkloudas pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new a0227e2  [FLINK-18362][yarn] Fix mistakenly merged commit 0e10fd5b8ee0
a0227e2 is described below

commit a0227e20430ee9eaff59464023de2385378f71ea
Author: Kostas Kloudas 
AuthorDate: Thu Jul 30 14:08:43 2020 +0200

[FLINK-18362][yarn] Fix mistakenly merged commit 0e10fd5b8ee0
---
 .../generated/yarn_config_configuration.html   |   2 +-
 .../test/java/org/apache/flink/yarn/UtilsTest.java |   4 +-
 .../java/org/apache/flink/yarn/YARNITCase.java |  14 +-
 .../flink/yarn/testjob/YarnTestArchiveJob.java | 146 +
 .../flink/yarn/YarnApplicationFileUploader.java|  87 
 .../apache/flink/yarn/YarnClusterDescriptor.java   |  53 
 .../flink/yarn/YarnLocalResourceDescriptor.java|  21 +--
 .../yarn/configuration/YarnConfigOptions.java  |   3 +-
 .../test/java/org/apache/flink/yarn/UtilsTest.java |   1 -
 .../org/apache/flink/yarn/YarnFileStageTest.java   |   9 +-
 .../yarn/YarnLocalResourceDescriptionTest.java |   4 +-
 11 files changed, 232 insertions(+), 112 deletions(-)

diff --git a/docs/_includes/generated/yarn_config_configuration.html 
b/docs/_includes/generated/yarn_config_configuration.html
index 8126629..7173a0e 100644
--- a/docs/_includes/generated/yarn_config_configuration.html
+++ b/docs/_includes/generated/yarn_config_configuration.html
@@ -138,7 +138,7 @@
 yarn.ship-archives
 (none)
 List<String>
-A semicolon-separated list of archives to be shipped to the 
YARN cluster. They will be un-packed when localizing.
+A semicolon-separated list of archives to be shipped to the 
YARN cluster. These archives will be un-packed when localizing and they can be 
any of the following types: ".tar.gz", ".tar", ".tgz", ".dst", ".jar", 
".zip".
 
 
 yarn.ship-directories
diff --git 
a/flink-yarn-tests/src/test/java/org/apache/flink/yarn/UtilsTest.java 
b/flink-yarn-tests/src/test/java/org/apache/flink/yarn/UtilsTest.java
index 76aac0e..a2ad133 100644
--- a/flink-yarn-tests/src/test/java/org/apache/flink/yarn/UtilsTest.java
+++ b/flink-yarn-tests/src/test/java/org/apache/flink/yarn/UtilsTest.java
@@ -33,6 +33,7 @@ import org.apache.hadoop.security.Credentials;
 import org.apache.hadoop.security.token.Token;
 import org.apache.hadoop.security.token.TokenIdentifier;
 import org.apache.hadoop.yarn.api.records.ContainerLaunchContext;
+import org.apache.hadoop.yarn.api.records.LocalResourceType;
 import org.apache.hadoop.yarn.api.records.LocalResourceVisibility;
 import org.apache.hadoop.yarn.conf.YarnConfiguration;
 import org.apache.hadoop.yarn.security.AMRMTokenIdentifier;
@@ -100,7 +101,8 @@ public class UtilsTest extends TestLogger {
new Path(root.toURI()),
0,
System.currentTimeMillis(),
-   LocalResourceVisibility.APPLICATION).toString());
+   LocalResourceVisibility.APPLICATION,
+   LocalResourceType.FILE).toString());
env = Collections.unmodifiableMap(env);
 
File credentialFile = 
temporaryFolder.newFile("container_tokens");
diff --git 
a/flink-yarn-tests/src/test/java/org/apache/flink/yarn/YARNITCase.java 
b/flink-yarn-tests/src/test/java/org/apache/flink/yarn/YARNITCase.java
index 88d91ca..4a9db23 100644
--- a/flink-yarn-tests/src/test/java/org/apache/flink/yarn/YARNITCase.java
+++ b/flink-yarn-tests/src/test/java/org/apache/flink/yarn/YARNITCase.java
@@ -31,6 +31,7 @@ import org.apache.flink.runtime.jobmaster.JobResult;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.streaming.api.functions.sink.DiscardingSink;
 import org.apache.flink.yarn.configuration.YarnConfigOptions;
+import org.apache.flink.yarn.testjob.YarnTestArchiveJob;
 import org.apache.flink.yarn.testjob.YarnTestCacheJob;
 import org.apache.flink.yarn.util.TestUtils;
 
@@ -39,7 +40,9 @@ import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.fs.permission.FsPermission;
 import org.apache.hadoop.yarn.api.records.ApplicationId;
 import org.junit.BeforeClass;
+import org.junit.Rule;
 import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
 
 import java.io.File;
 import java.io.IOException;
@@ -63,6 +66,9 @@ public class YARNITCase extends YarnTestBase {
private static final Duration yarnAppTerminateTimeout = 
Duration.ofSeconds(10);
private static final int sleepIntervalInMS = 100;
 
+   @Rule
+   public TemporaryFolder temporaryFolder = new TemporaryFolder();
+
@BeforeClass
public static void setup() {
YARN_CONFIGURATION.set(YarnTestBase.TEST_CLUSTER_NAME_KEY, 

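The doc change above tightens the yarn.ship-archives description: the option takes a semicolon-separated list of archives, which are un-packed when localizing and must use one of a fixed set of extensions. A minimal sketch of that validation (hypothetical helper class, not Flink API; the extension list, including ".dst", is quoted verbatim from the description above):

```java
import java.util.Arrays;
import java.util.List;

public class ShipArchives {
    // Archive extensions accepted per the updated yarn.ship-archives description.
    static final List<String> SUPPORTED_EXTENSIONS =
            Arrays.asList(".tar.gz", ".tar", ".tgz", ".dst", ".jar", ".zip");

    static boolean isSupportedArchive(String path) {
        return SUPPORTED_EXTENSIONS.stream().anyMatch(path::endsWith);
    }

    public static void main(String[] args) {
        // The option value is a semicolon-separated list of archives.
        String shipArchives = "udfs.jar;models.tar.gz;notes.txt";
        for (String entry : shipArchives.split(";")) {
            System.out.println(entry + " supported: " + isSupportedArchive(entry));
        }
    }
}
```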
[flink-docker] 01/01: [FLINK-16260] Generate release.metadata file

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a commit to branch dev-master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git

commit 15491755aef85694adafe673a98196e7754e628b
Author: Robert Metzger 
AuthorDate: Mon Jul 27 15:28:24 2020 +0200

[FLINK-16260] Generate release.metadata file

This closes #32
---
 add-custom.sh   | 14 ---
 add-version.sh  | 20 ++
 common.sh   | 35 --
 generator.sh| 90 +
 testing/run_travis_tests.sh | 19 +-
 5 files changed, 122 insertions(+), 56 deletions(-)

diff --git a/add-custom.sh b/add-custom.sh
index c863f3f..e8e2a5e 100755
--- a/add-custom.sh
+++ b/add-custom.sh
@@ -4,16 +4,17 @@
 # Flink distribution.
# This is exclusively for development purposes.
 
-source "$(dirname "$0")"/common.sh
+source "$(dirname "$0")"/generator.sh
 
 function usage() {
-echo >&2 "usage: $0 -u binary-download-url [-n name]"
+echo >&2 "usage: $0 -u binary-download-url [-n name] [-j java_version]"
 }
 
 binary_download_url=
 name=custom
+java_version=8
 
-while getopts u:n:h arg; do
+while getopts u:n:j:h arg; do
   case "$arg" in
 u)
   binary_download_url=$OPTARG
@@ -21,6 +22,9 @@ while getopts u:n:h arg; do
 n)
   name=$OPTARG
   ;;
+j)
+  java_version=$OPTARG
+  ;;
 h)
   usage
   exit 0
@@ -43,7 +47,7 @@ echo -n >&2 "Generating Dockerfiles..."
 for source_variant in "${SOURCE_VARIANTS[@]}"; do
   dir="dev/${name}-${source_variant}"
   rm -rf "${dir}"
-
-  generate "${dir}" "${binary_download_url}" "" "" false ${source_variant}
+  mkdir "$dir"
+  generateDockerfile "${dir}" "${binary_download_url}" "" "" false 
${java_version} ${source_variant}
 done
 echo >&2 " done."
diff --git a/add-version.sh b/add-version.sh
index 48acc79..fd07d92 100755
--- a/add-version.sh
+++ b/add-version.sh
@@ -10,7 +10,6 @@
 #
 # See other repos (e.g. httpd, cassandra) for update.sh examples.
 
-source "$(dirname "$0")"/common.sh
 
 function usage() {
 echo >&2 "usage: $0 -r flink-release -f flink-version"
@@ -60,6 +59,7 @@ fi
 
 # Defaults, can vary between versions
 scala_versions=( 2.11 2.12 )
+java_versions=( 8 11 )
 gpg_key=
 
 # Version-specific variants (example)
@@ -98,18 +98,24 @@ fi
 
 mkdir "$flink_release"
 
+source "$(dirname "$0")"/generator.sh
+
 echo -n >&2 "Generating Dockerfiles..."
 for source_variant in "${SOURCE_VARIANTS[@]}"; do
 for scala_version in "${scala_versions[@]}"; do
-dir="$flink_release/scala_${scala_version}-${source_variant}"
+for java_version in "${java_versions[@]}"; do
+
dir="$flink_release/scala_${scala_version}-java${java_version}-${source_variant}"
 
-
flink_url_file_path=flink/flink-${flink_version}/flink-${flink_version}-bin-scala_${scala_version}.tgz
+
flink_url_file_path=flink/flink-${flink_version}/flink-${flink_version}-bin-scala_${scala_version}.tgz
 
-
flink_tgz_url="https://www.apache.org/dyn/closer.cgi?action=download&filename=${flink_url_file_path}"
-# Not all mirrors have the .asc files
-flink_asc_url=https://www.apache.org/dist/${flink_url_file_path}.asc
+
flink_tgz_url="https://www.apache.org/dyn/closer.cgi?action=download&filename=${flink_url_file_path}"
+# Not all mirrors have the .asc files
+
flink_asc_url=https://www.apache.org/dist/${flink_url_file_path}.asc
 
-generate "${dir}" "${flink_tgz_url}" "${flink_asc_url}" ${gpg_key} 
true ${source_variant}
+mkdir "$dir"
+generateDockerfile "${dir}" "${flink_tgz_url}" "${flink_asc_url}" 
${gpg_key} true ${java_version} ${source_variant}
+generateReleaseMetadata "${dir}" ${flink_release} ${flink_version} 
${scala_version} ${java_version} ${source_variant}
+done
 done
 done
 echo >&2 " done."
diff --git a/common.sh b/common.sh
deleted file mode 100644
index dcf5052..000
--- a/common.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -e
-
-# Defaults, can vary between versions
-export SOURCE_VARIANTS=(java11-debian debian )
-
-function generate() {
-dir=$1
-binary_download_url=$2
-asc_download_url=$3
-gpg_key=$4
-check_gpg=$5
-source_variant=$6
-
-from_docker_image="openjdk:8-jre"
-if [[ $source_variant =~ "java11" ]] ; then
-from_docker_image="openjdk:11-jre"
-fi
-
-source_file="Dockerfile-debian"
-
-mkdir "$dir"
-cp docker-entrypoint.sh "$dir/docker-entrypoint.sh"
-
-# '&' has special semantics in sed replacement patterns
-escaped_binary_download_url=$(echo "$binary_download_url" | sed 
's/&/\\\&/')
-
-# generate Dockerfile
-sed \
--e "s,%%BINARY_DOWNLOAD_URL%%,${escaped_binary_download_url}," \
--e "s,%%ASC_DOWNLOAD_URL%%,$asc_download_url," \
--e "s/%%GPG_KEY%%/$gpg_key/" \
--e 

[flink-docker] branch dev-master updated (95c87d1 -> 1549175)

2020-07-30 Thread rmetzger
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a change to branch dev-master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git.


from 95c87d1  [FLINK-18497] Add GPG key for 1.11.0 release
 add d7b4c98  [FLINK-16260] Add support for generating Java 11 dockerfiles
 new 1549175  [FLINK-16260] Generate release.metadata file

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 Dockerfile-debian.template  |  2 +-
 add-custom.sh   | 14 ---
 add-version.sh  | 20 ++
 common.sh   | 26 -
 generator.sh| 90 +
 testing/run_travis_tests.sh | 19 +-
 6 files changed, 123 insertions(+), 48 deletions(-)
 delete mode 100644 common.sh
 create mode 100644 generator.sh



[flink] branch master updated (bb66409 -> 0e10fd5)

2020-07-30 Thread kkloudas
This is an automated email from the ASF dual-hosted git repository.

kkloudas pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from bb66409  [hotfix][docs] Update currentInputNWatermark metrics docs 
including N-ary operator
 add 0e10fd5  [FLINK-18362][FLINK-13838][yarn] Add yarn.ship-archives to 
support LocalResourceType.ARCHIVE

No new revisions were added by this update.

Summary of changes:
 .../generated/yarn_config_configuration.html   |  6 ++
 .../src/main/java/org/apache/flink/yarn/Utils.java | 19 +++--
 .../flink/yarn/YarnApplicationFileUploader.java| 86 --
 .../apache/flink/yarn/YarnClusterDescriptor.java   | 42 ++-
 .../flink/yarn/YarnLocalResourceDescriptor.java| 44 ---
 .../yarn/configuration/YarnConfigOptions.java  |  8 ++
 .../test/java/org/apache/flink/yarn/UtilsTest.java |  1 +
 .../yarn/YarnLocalResourceDescriptionTest.java |  7 +-
 .../apache/flink/yarn/YarnResourceManagerTest.java |  4 +-
 9 files changed, 175 insertions(+), 42 deletions(-)



[flink] branch master updated: [hotfix][docs] Update currentInputNWatermark metrics docs including N-ary operator

2020-07-30 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new bb66409  [hotfix][docs] Update currentInputNWatermark metrics docs 
including N-ary operator
bb66409 is described below

commit bb6640954d6e38b20c0466cff651a621d9753f7a
Author: Piotr Nowojski 
AuthorDate: Wed Jul 29 15:52:25 2020 +0200

[hotfix][docs] Update currentInputNWatermark metrics docs including N-ary 
operator
---
 docs/monitoring/metrics.md| 16 
 docs/monitoring/metrics.zh.md | 16 
 2 files changed, 8 insertions(+), 24 deletions(-)

diff --git a/docs/monitoring/metrics.md b/docs/monitoring/metrics.md
index 3d172de..9b348f4 100644
--- a/docs/monitoring/metrics.md
+++ b/docs/monitoring/metrics.md
@@ -1471,19 +1471,11 @@ Certain RocksDB native metrics are available but 
disabled by default, you can fi
   Gauge
 
 
-  Operator
-  currentInput1Watermark
+  Operator
+  currentInputNWatermark
   
-The last watermark this operator has received in its first input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
-  
-  Gauge
-
-
-  currentInput2Watermark
-  
-The last watermark this operator has received in its second input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
+The last watermark this operator has received in its 
N'th input (in milliseconds), with index N 
starting from 1. For example currentInput1Watermark, 
currentInput2Watermark, ...
+Note: Only for operators with 2 or more inputs.
   
   Gauge
 
diff --git a/docs/monitoring/metrics.zh.md b/docs/monitoring/metrics.zh.md
index fe65d25..b2ecbcd 100644
--- a/docs/monitoring/metrics.zh.md
+++ b/docs/monitoring/metrics.zh.md
@@ -1470,19 +1470,11 @@ Certain RocksDB native metrics are available but 
disabled by default, you can fi
   Gauge
 
 
-  Operator
-  currentInput1Watermark
+  Operator
+  currentInputNWatermark
   
-The last watermark this operator has received in its first input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
-  
-  Gauge
-
-
-  currentInput2Watermark
-  
-The last watermark this operator has received in its second input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
+The last watermark this operator has received in its 
N'th input (in milliseconds), with index N 
starting from 1. For example currentInput1Watermark, 
currentInput2Watermark, ...
+Note: Only for operators with 2 or more inputs.
   
   Gauge
 



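The doc fix above replaces the separate currentInput1Watermark/currentInput2Watermark entries with the general pattern currentInputNWatermark, where N is a 1-based input index for operators with two or more inputs. The naming scheme can be sketched as (hypothetical helper, not part of Flink's metrics API):

```java
public class WatermarkMetricNames {
    // Builds the per-input watermark gauge names following the pattern
    // currentInputNWatermark, with index N starting from 1.
    static String[] names(int inputCount) {
        String[] result = new String[inputCount];
        for (int n = 1; n <= inputCount; n++) {
            result[n - 1] = "currentInput" + n + "Watermark";
        }
        return result;
    }

    public static void main(String[] args) {
        // A two-input operator exposes currentInput1Watermark and currentInput2Watermark.
        for (String name : names(2)) {
            System.out.println(name);
        }
    }
}
```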
[flink] branch release-1.11 updated: [hotfix][docs] Update currentInputNWatermark metrics docs including N-ary operator

2020-07-30 Thread pnowojski
This is an automated email from the ASF dual-hosted git repository.

pnowojski pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new ae041d9  [hotfix][docs] Update currentInputNWatermark metrics docs 
including N-ary operator
ae041d9 is described below

commit ae041d9cfa02065a083978d34fb464181b1cb1c2
Author: Piotr Nowojski 
AuthorDate: Wed Jul 29 15:52:25 2020 +0200

[hotfix][docs] Update currentInputNWatermark metrics docs including N-ary 
operator
---
 docs/monitoring/metrics.md| 16 
 docs/monitoring/metrics.zh.md | 16 
 2 files changed, 8 insertions(+), 24 deletions(-)

diff --git a/docs/monitoring/metrics.md b/docs/monitoring/metrics.md
index ebb3aa0..882a52c 100644
--- a/docs/monitoring/metrics.md
+++ b/docs/monitoring/metrics.md
@@ -1470,19 +1470,11 @@ Certain RocksDB native metrics are available but 
disabled by default, you can fi
   Gauge
 
 
-  Operator
-  currentInput1Watermark
+  Operator
+  currentInputNWatermark
   
-The last watermark this operator has received in its first input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
-  
-  Gauge
-
-
-  currentInput2Watermark
-  
-The last watermark this operator has received in its second input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
+The last watermark this operator has received in its 
N'th input (in milliseconds), with index N 
starting from 1. For example currentInput1Watermark, 
currentInput2Watermark, ...
+Note: Only for operators with 2 or more inputs.
   
   Gauge
 
diff --git a/docs/monitoring/metrics.zh.md b/docs/monitoring/metrics.zh.md
index fe65d25..b2ecbcd 100644
--- a/docs/monitoring/metrics.zh.md
+++ b/docs/monitoring/metrics.zh.md
@@ -1470,19 +1470,11 @@ Certain RocksDB native metrics are available but 
disabled by default, you can fi
   Gauge
 
 
-  Operator
-  currentInput1Watermark
+  Operator
+  currentInputNWatermark
   
-The last watermark this operator has received in its first input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
-  
-  Gauge
-
-
-  currentInput2Watermark
-  
-The last watermark this operator has received in its second input (in 
milliseconds).
-Note: Only for operators with 2 inputs.
+The last watermark this operator has received in its 
N'th input (in milliseconds), with index N 
starting from 1. For example currentInput1Watermark, 
currentInput2Watermark, ...
+Note: Only for operators with 2 or more inputs.
   
   Gauge
 



[flink] branch master updated (875e95f -> 73a3111)

2020-07-30 Thread xtsong
This is an automated email from the ASF dual-hosted git repository.

xtsong pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 875e95f  [FLINK-18493] Make Yarn staging directory for Flink 
application configurable
 add 73a3111  [FLINK-18625][runtime] Maintain redundant taskmanagers to 
speed up failover

No new revisions were added by this update.

Summary of changes:
 .../generated/resource_manager_configuration.html  |   6 +
 .../configuration/ResourceManagerOptions.java  |  13 ++
 .../slotmanager/SlotManagerConfiguration.java  |  15 ++-
 .../slotmanager/SlotManagerImpl.java   |  72 +++---
 .../resourcemanager/ResourceManagerHATest.java |   3 +-
 .../slotmanager/SlotManagerBuilder.java|  10 +-
 .../slotmanager/SlotManagerImplTest.java   |   4 +
 ...java => TaskManagerCheckInSlotManagerTest.java} | 146 -
 8 files changed, 245 insertions(+), 24 deletions(-)
 rename 
flink-runtime/src/test/java/org/apache/flink/runtime/resourcemanager/slotmanager/{TaskManagerReleaseInSlotManagerTest.java
 => TaskManagerCheckInSlotManagerTest.java} (56%)



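The FLINK-18625 change above has the slot manager maintain redundant task managers so failover does not wait for new containers to start. The arithmetic behind that idea can be sketched as follows (hypothetical helper, not Flink code; parameter names are illustrative):

```java
public class RedundantTaskManagerMath {
    // Beyond the task managers that current slot demand requires, keep
    // `redundant` spare ones registered so failed tasks can be rescheduled
    // immediately instead of waiting for container startup.
    static int taskManagersToKeep(int requiredSlots, int slotsPerTaskManager, int redundant) {
        // Ceiling division: task managers needed to cover the required slots.
        int needed = (requiredSlots + slotsPerTaskManager - 1) / slotsPerTaskManager;
        return needed + redundant;
    }

    public static void main(String[] args) {
        // 10 required slots at 4 slots per task manager -> 3 needed, plus 1 spare.
        System.out.println(taskManagersToKeep(10, 4, 1)); // prints 4
    }
}
```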
[flink] branch master updated (bdaf1db -> 875e95f)

2020-07-30 Thread trohrmann
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from bdaf1db  [FLINK-15728][jdbc] Introduce FieldNamedPreparedStatement to 
support fields are bound multiple times in update statement
 add 875e95f  [FLINK-18493] Make Yarn staging directory for Flink 
application configurable

No new revisions were added by this update.

Summary of changes:
 docs/_includes/generated/yarn_config_configuration.html|  6 ++
 .../java/org/apache/flink/yarn/YarnClusterDescriptor.java  | 14 --
 .../apache/flink/yarn/configuration/YarnConfigOptions.java |  6 ++
 3 files changed, 24 insertions(+), 2 deletions(-)



[flink] branch master updated (9f406a6 -> bdaf1db)

2020-07-30 Thread jark
This is an automated email from the ASF dual-hosted git repository.

jark pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 9f406a6  [FLINK-13872][docs-zh] Translate Operations Playground to 
Chinese
 add f86cb1a  [FLINK-18579][jdbc] Remove deprecated classes in 
flink-connector-jdbc
 add bdaf1db  [FLINK-15728][jdbc] Introduce FieldNamedPreparedStatement to 
support fields are bound multiple times in update statement

No new revisions were added by this update.

Summary of changes:
 .../flink/api/java/io/jdbc/JDBCInputFormat.java| 202 ---
 .../flink/api/java/io/jdbc/JDBCOutputFormat.java   | 226 -
 .../jdbc/split/GenericParameterValuesProvider.java |  43 ---
 .../split/NumericBetweenParametersProvider.java|  65 
 .../io/jdbc/split/ParameterValuesProvider.java |  34 --
 .../flink/connector/jdbc/dialect/JdbcDialect.java  |  14 +-
 .../jdbc/internal/JdbcBatchingOutputFormat.java|   7 +-
 .../jdbc/internal/TableJdbcUpsertOutputFormat.java |  18 +-
 .../converter/AbstractJdbcRowConverter.java|  33 +-
 .../jdbc/internal/converter/JdbcRowConverter.java  |   4 +-
 .../executor/InsertOrUpdateJdbcExecutor.java   |   3 +
 ...va => TableBufferReducedStatementExecutor.java} |  14 +-
 ...or.java => TableBufferedStatementExecutor.java} |  55 ++-
 .../TableInsertOrUpdateStatementExecutor.java  | 115 +++
 ...utor.java => TableSimpleStatementExecutor.java} |  54 ++-
 .../statement/FieldNamedPreparedStatement.java | 264 +++
 .../statement/FieldNamedPreparedStatementImpl.java | 240 +
 .../connector/jdbc/statement/StatementFactory.java |  16 +-
 .../jdbc/table/JdbcDynamicOutputFormatBuilder.java | 155 +
 .../connector/jdbc/table/JdbcLookupFunction.java   |   9 +-
 .../jdbc/table/JdbcRowDataLookupFunction.java  |   8 +-
 .../connector/jdbc/table/JdbcTableSource.java  |   9 +-
 .../api/java/io/jdbc/JDBCInputFormatTest.java  | 372 -
 .../api/java/io/jdbc/JDBCOutputFormatTest.java | 259 --
 .../FieldNamedPreparedStatementImplTest.java   | 174 ++
 .../jdbc/table/JdbcDynamicOutputFormatTest.java|   8 +-
 .../jdbc/table/JdbcDynamicTableSinkITCase.java |   6 +-
 .../jdbc/table/JdbcLookupTableITCase.java  |  10 +-
 28 files changed, 1018 insertions(+), 1399 deletions(-)
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormat.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCOutputFormat.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/split/GenericParameterValuesProvider.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/split/NumericBetweenParametersProvider.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/split/ParameterValuesProvider.java
 rename 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/{BufferReduceStatementExecutor.java
 => TableBufferReducedStatementExecutor.java} (89%)
 copy 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/{SimpleBatchStatementExecutor.java
 => TableBufferedStatementExecutor.java} (51%)
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/TableInsertOrUpdateStatementExecutor.java
 copy 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/{KeyedBatchStatementExecutor.java
 => TableSimpleStatementExecutor.java} (51%)
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/statement/FieldNamedPreparedStatement.java
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/statement/FieldNamedPreparedStatementImpl.java
 copy 
flink-clients/src/main/java/org/apache/flink/client/program/ClusterClientProvider.java
 => 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/statement/StatementFactory.java
 (66%)
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormatTest.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCOutputFormatTest.java
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/statement/FieldNamedPreparedStatementImplTest.java



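The FLINK-15728 commit above introduces FieldNamedPreparedStatement so that one field can be bound to several parameter positions of an UPDATE statement (for example in both the SET and the WHERE clause). A minimal sketch of the core idea — rewriting `:field` placeholders to positional `?` markers while recording every 1-based position a field occupies — using hypothetical names, independent of the actual Flink classes:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NamedParamParser {
    final String parsedSql;                   // SQL with named params replaced by '?'
    final Map<String, List<Integer>> indexes; // field name -> 1-based '?' positions

    NamedParamParser(String sql) {
        indexes = new HashMap<>();
        Matcher m = Pattern.compile(":(\\w+)").matcher(sql);
        StringBuffer sb = new StringBuffer();
        int pos = 0;
        while (m.find()) {
            pos++;
            // A field bound multiple times gets every position recorded.
            indexes.computeIfAbsent(m.group(1), k -> new ArrayList<>()).add(pos);
            m.appendReplacement(sb, "?");
        }
        m.appendTail(sb);
        parsedSql = sb.toString();
    }

    public static void main(String[] args) {
        // `id` appears twice: once in SET, once in WHERE.
        NamedParamParser p = new NamedParamParser(
                "UPDATE t SET id = :id, name = :name WHERE id = :id");
        System.out.println(p.parsedSql);         // UPDATE t SET id = ?, name = ? WHERE id = ?
        System.out.println(p.indexes.get("id")); // [1, 3]
    }
}
```

With the recorded positions, a single named set call (say, setInt on "id") can delegate to the underlying java.sql.PreparedStatement once per occurrence.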
 .../jdbc/table/JdbcDynamicOutputFormatTest.java|   8 +-
 .../jdbc/table/JdbcDynamicTableSinkITCase.java |   6 +-
 .../jdbc/table/JdbcLookupTableITCase.java  |  10 +-
 28 files changed, 1018 insertions(+), 1399 deletions(-)
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormat.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/JDBCOutputFormat.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/split/GenericParameterValuesProvider.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/split/NumericBetweenParametersProvider.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/split/ParameterValuesProvider.java
 rename 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/{BufferReduceStatementExecutor.java
 => TableBufferReducedStatementExecutor.java} (89%)
 copy 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/{SimpleBatchStatementExecutor.java
 => TableBufferedStatementExecutor.java} (51%)
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/TableInsertOrUpdateStatementExecutor.java
 copy 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/{KeyedBatchStatementExecutor.java
 => TableSimpleStatementExecutor.java} (51%)
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/statement/FieldNamedPreparedStatement.java
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/statement/FieldNamedPreparedStatementImpl.java
 copy 
flink-clients/src/main/java/org/apache/flink/client/program/ClusterClientProvider.java
 => 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/statement/StatementFactory.java
 (66%)
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormatTest.java
 delete mode 100644 
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCOutputFormatTest.java
 create mode 100644 
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/statement/FieldNamedPreparedStatementImplTest.java