[flink] branch master updated (712207a -> 4fcd877)

2020-02-04 Thread kurt
This is an automated email from the ASF dual-hosted git repository.

kurt pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 712207a  [hotfix][docs] Fix missing double quotes in catalog docs
 add 4fcd877  [FLINK-15858][hive] Store generic table schema as properties

No new revisions were added by this update.

Summary of changes:
 docs/dev/table/hive/hive_catalog.md|  11 +-
 docs/dev/table/hive/hive_catalog.zh.md |  11 +-
 .../flink/table/catalog/hive/HiveCatalog.java  | 124 -
 .../table/catalog/hive/HiveCatalogConfig.java  |   3 +
 .../hive/HiveCatalogGenericMetadataTest.java   |  43 +++
 .../table/catalog/hive/HiveCatalogITCase.java  |  36 ++
 .../src/test/resources/csv/test3.csv   |   5 +
 .../apache/flink/table/catalog/CatalogTest.java|  36 +++---
 8 files changed, 190 insertions(+), 79 deletions(-)
 create mode 100644 flink-connectors/flink-connector-hive/src/test/resources/csv/test3.csv

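Editor's note: the FLINK-15858 commit above stores a generic table's schema in the Hive metastore as plain table properties. The actual property keys used by `HiveCatalog` are not visible in this mail, so the `schema.N.*` names below are assumptions; this is only a sketch of the flatten/restore idea, not Flink's implementation:

```python
def schema_to_properties(fields):
    """Flatten an ordered list of (name, data_type) columns into a flat
    string map, the shape a metastore table's parameters can hold.
    Key names are illustrative, not HiveCatalog's actual keys."""
    props = {"schema.version": "1"}  # hypothetical versioning key
    for i, (name, data_type) in enumerate(fields):
        props[f"schema.{i}.name"] = name
        props[f"schema.{i}.data-type"] = data_type
    return props

def properties_to_schema(props):
    """Inverse: rebuild the ordered column list from the property map."""
    fields = []
    i = 0
    while f"schema.{i}.name" in props:
        fields.append((props[f"schema.{i}.name"], props[f"schema.{i}.data-type"]))
        i += 1
    return fields
```

The point of the round trip is that a schema survives being stored in a string-only key/value store, which is what "store generic table schema as properties" implies.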


svn commit: r37874 - in /dev/flink/flink-1.9.2-rc0: ./ apache-flink-1.9.2.tar.gz apache-flink-1.9.2.tar.gz.asc apache-flink-1.9.2.tar.gz.sha512

2020-02-04 Thread jincheng
Author: jincheng
Date: Wed Feb  5 05:51:18 2020
New Revision: 37874

Log:
Add PyFlink 1.9.2

Added:
dev/flink/flink-1.9.2-rc0/
dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz   (with props)
dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc
dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512

Added: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz
==============================================================================
Binary file - no diff available.

Propchange: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz
------------------------------------------------------------------------------
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc
==============================================================================
--- dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc (added)
+++ dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc Wed Feb  5 05:51:18 
2020
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl45LogACgkQMhGwcDt5
+6g6SAxAAwyxmwGc5b1NTJuTGDWdsfU0RGca14MMievG+a06hzWuZlaZc75bYiWOP
+NzQNmnMrfg2ZhXMAffQK0ipEM/YrZSBdom4ES9ZZc6kIRwwzALTJ34kkmBNyZ6W1
+S2t1Gu0vgzBtZFMm1EqyIjJxIehzGwsS20Gn+sSFoN/OgcXdxBxfrTE7IMeWRxPy
+sbdFB4TbAwyGTPTG5RlGO1OcNQ66ANBuzL+20ajDylbjYIIUB+CrLs0+cWh0NV1H
+0hYSmv/wZITga+8iYzj3OfRoqR0xF/fn9TGvn0Z4GQXETrKnPusjpbjH+0RRcrgd
+qfq6GEyNx2wPBIzC+IwAil+juZlolqp4gt5qwsoknY++J6XCIkvd7CxWzD9t9rQj
+p9rdYpnaTkPpy0GZjOeOZ/azBRaRFLL7pC5+oQF8Hq4EH1ECkFQT3W7GuTCRiCco
+zBjg3FyVJGk1ugv1yt+R7viUv8ezcAmWOXQuRJZft3Tk6yAzt+Yd0EPGt6hXF1dA
+BsM0+aBJJafNpQPIapj+i1GdrPX2mb8Q2mqPKckSWM6qaYO3tnW2nSIELBb3lcqj
+GXqXncXNZrEJ+VoqZqrbPvZwsO4V8GJDN4BpCkUxHEouuR8xldS7B3c35YzePvpn
+JQGx4t8B2hGJ38HasXW0xPjqrwytca/Fa9D8g/7nJ6yv4YTzw6E=
+=zu1Q
+-----END PGP SIGNATURE-----

Added: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512
==============================================================================
--- dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512 (added)
+++ dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512 Wed Feb  5 
05:51:18 2020
@@ -0,0 +1 @@
+d4d5bb3d4b4ce4bfb910eb5f8622322e23f416ad13140f29e29a550d7ab780c01bf0c6d8d26c31476b0c5566d40437dfe2015d5b9a94def217e518ab5e73f366  apache-flink-1.9.2.tar.gz

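Editor's note: the upload above ships a detached GPG signature (`.asc`) and a SHA-512 checksum file, which release voters verify before approving a candidate. As a sketch only (not the official ASF verification procedure), the `<hex-digest>  <filename>` line format of the `.sha512` file can be checked like this:

```python
import hashlib

def verify_sha512(file_path, sha512_line):
    """Check a file against a '<hex-digest>  <filename>' checksum line,
    the format of the .sha512 file uploaded above."""
    expected_hex, _, _ = sha512_line.strip().partition("  ")
    h = hashlib.sha512()
    with open(file_path, "rb") as f:
        # hash in 1 MiB chunks so large release tarballs don't need to fit in memory
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex
```

The signature (`.asc`) is separately checked with `gpg --verify` against the Flink KEYS file; that step is not sketched here.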



[flink] branch release-1.10 updated: [hotfix][docs] Fix missing double quotes in catalog docs

2020-02-04 Thread bli
This is an automated email from the ASF dual-hosted git repository.

bli pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 92d130c  [hotfix][docs] Fix missing double quotes in catalog docs
92d130c is described below

commit 92d130c7cad8e76d6cf77906394bdba1a48434dd
Author: Tartarus 
AuthorDate: Tue Feb 4 12:01:08 2020 +0800

[hotfix][docs] Fix missing double quotes in catalog docs

closes #11008
---
 docs/dev/table/catalogs.md| 2 +-
 docs/dev/table/catalogs.zh.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/dev/table/catalogs.md b/docs/dev/table/catalogs.md
index 6fb2b47..876e808 100644
--- a/docs/dev/table/catalogs.md
+++ b/docs/dev/table/catalogs.md
@@ -134,7 +134,7 @@ catalog.createTable(
 )
 );
 
-List<String> tables = catalog.listTables("mydb); // tables should contain "mytable"
+List<String> tables = catalog.listTables("mydb"); // tables should contain "mytable"
 {% endhighlight %}
 
 
diff --git a/docs/dev/table/catalogs.zh.md b/docs/dev/table/catalogs.zh.md
index e07e840..1814f5c 100644
--- a/docs/dev/table/catalogs.zh.md
+++ b/docs/dev/table/catalogs.zh.md
@@ -134,7 +134,7 @@ catalog.createTable(
 )
 );
 
-List<String> tables = catalog.listTables("mydb); // tables should contain "mytable"
+List<String> tables = catalog.listTables("mydb"); // tables should contain "mytable"
 {% endhighlight %}
 
 



[flink] branch release-1.10 updated: [FLINK-15614][docs] Consolidate Hadoop documentation

2020-02-04 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new f26e0a9  [FLINK-15614][docs] Consolidate Hadoop documentation
f26e0a9 is described below

commit f26e0a9bf0538dbf1644118bfffae5ec8c352ca9
Author: Chesnay Schepler 
AuthorDate: Thu Jan 23 14:38:42 2020 +0100

[FLINK-15614][docs] Consolidate Hadoop documentation
---
 docs/flinkDev/building.md| 37 +--
 docs/flinkDev/building.zh.md | 37 +--
 docs/ops/deployment/hadoop.md| 42 +++-
 docs/ops/deployment/hadoop.zh.md | 42 +++-
 4 files changed, 76 insertions(+), 82 deletions(-)

diff --git a/docs/flinkDev/building.md b/docs/flinkDev/building.md
index bf4ebe6..c024e63 100644
--- a/docs/flinkDev/building.md
+++ b/docs/flinkDev/building.md
@@ -97,42 +97,7 @@ mvn clean install
 
 ## Hadoop Versions
 
-Flink has optional dependencies to HDFS and YARN which are both dependencies from [Apache Hadoop](http://hadoop.apache.org). There exist many different versions of Hadoop (from both the upstream project and the different Hadoop distributions). If you are using an incompatible combination of versions, exceptions may occur.
-
-Flink can be built against any Hadoop version >= 2.4.0, but depending on the version it may be a 1 or 2 step process.
-
-### Pre-bundled versions
-
-To build against Hadoop 2.4.1, 2.6.5, 2.7.5 or 2.8.3, it is sufficient to run (e.g., for version `2.6.5`):
-
-{% highlight bash %}
-mvn clean install -DskipTests -Dhadoop.version=2.6.5
-{% endhighlight %}
-
-To package a shaded pre-packaged Hadoop jar into the distributions `/lib` directory, activate the `include-hadoop` profile:
-
-{% highlight bash %}
-mvn clean install -DskipTests -Pinclude-hadoop
-{% endhighlight %}
-
-### Custom / Vendor-specific versions
-
-If you want to build against Hadoop version that is *NOT* 2.4.1, 2.6.5, 2.7.5 or 2.8.3,
-then it is first necessary to build [flink-shaded](https://github.com/apache/flink-shaded) against this version.
-You can find the source for this project in the [Additional Components]({{ site.download_url }}#additional-components) section of the download page.
-
-Note If you want to build `flink-shaded` against a vendor specific Hadoop version, you first have to configure the
-vendor-specific maven repository in your local maven setup as described [here](https://maven.apache.org/guides/mini/guide-multiple-repositories.html).
-
-Run the following command to build and install `flink-shaded` against your desired Hadoop version (e.g., for version `2.6.5-custom`):
-
-{% highlight bash %}
-mvn clean install -Dhadoop.version=2.6.5-custom
-{% endhighlight %}
-
-After this step is complete, follow the steps for [Pre-bundled versions](#pre-bundled-versions).
-
-{% top %}
+Please see the [Hadoop integration section]({{ site.baseurl }}/ops/deployment/hadoop.html) on how to handle Hadoop classes and versions.
 
 ## Scala Versions
 
diff --git a/docs/flinkDev/building.zh.md b/docs/flinkDev/building.zh.md
index 181f37c..756d484 100644
--- a/docs/flinkDev/building.zh.md
+++ b/docs/flinkDev/building.zh.md
@@ -97,42 +97,7 @@ mvn clean install
 
 ## Hadoop Versions
 
-Flink has optional dependencies to HDFS and YARN which are both dependencies from [Apache Hadoop](http://hadoop.apache.org). There exist many different versions of Hadoop (from both the upstream project and the different Hadoop distributions). If you are using an incompatible combination of versions, exceptions may occur.
-
-Flink can be built against any Hadoop version >= 2.4.0, but depending on the version it may be a 1 or 2 step process.
-
-### Pre-bundled versions
-
-To build against Hadoop 2.4.1, 2.6.5, 2.7.5 or 2.8.3, it is sufficient to run (e.g., for version `2.6.5`):
-
-{% highlight bash %}
-mvn clean install -DskipTests -Dhadoop.version=2.6.5
-{% endhighlight %}
-
-To package a shaded pre-packaged Hadoop jar into the distributions `/lib` directory, activate the `include-hadoop` profile:
-
-{% highlight bash %}
-mvn clean install -DskipTests -Pinclude-hadoop
-{% endhighlight %}
-
-### Custom / Vendor-specific versions
-
-If you want to build against Hadoop version that is *NOT* 2.4.1, 2.6.5, 2.7.5 or 2.8.3,
-then it is first necessary to build [flink-shaded](https://github.com/apache/flink-shaded) against this version.
-You can find the source for this project in the [Additional Components]({{ site.download_url }}#additional-components) section of the download page.
-
-Note If you want to build `flink-shaded` against a vendor specific Hadoop version, you first have to configure the
-vendor-specific maven repository in your local maven setup as described [here](https://maven.apache.org/guides/mini/guide-multiple-repositories.html).
-
-Run the following command to build and install `flink-shaded` against your desired Hadoop version (e.g., for version `2.6.5-custom`):
-
-{% highlight bash %}
-mvn clean install -Dhadoop.version=2.6.5-custom
-{% endhighlight %}
-
-After this step is complete, follow the steps for [Pre-bundled versions](#pre-bundled-versions).
-
-{% top %}
+Please see the [Hadoop integration section]({{ site.baseurl }}/ops/deployment/hadoop.html) on how to handle Hadoop classes and versions.

[flink] branch master updated (2ae305c -> d2de15c)

2020-02-04 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 2ae305c  [FLINK-15807][docs] Add Java 11 to supported JDKs
 add d2de15c  [FLINK-15614][docs] Consolidate Hadoop documentation

No new revisions were added by this update.

Summary of changes:
 docs/flinkDev/building.md| 37 +--
 docs/flinkDev/building.zh.md | 37 +--
 docs/ops/deployment/hadoop.md| 42 +++-
 docs/ops/deployment/hadoop.zh.md | 42 +++-
 4 files changed, 76 insertions(+), 82 deletions(-)



[flink] branch release-1.10 updated: [FLINK-15807][docs] Add Java 11 to supported JDKs

2020-02-04 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 635519c  [FLINK-15807][docs] Add Java 11 to supported JDKs
635519c is described below

commit 635519c24ddef3da83fdca6e6245867e52dd0ed7
Author: Chesnay Schepler 
AuthorDate: Thu Jan 30 12:30:09 2020 +0100

[FLINK-15807][docs] Add Java 11 to supported JDKs
---
 docs/getting-started/walkthroughs/datastream_api.md| 2 +-
 docs/getting-started/walkthroughs/datastream_api.zh.md | 2 +-
 docs/getting-started/walkthroughs/table_api.md | 2 +-
 docs/getting-started/walkthroughs/table_api.zh.md  | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/getting-started/walkthroughs/datastream_api.md b/docs/getting-started/walkthroughs/datastream_api.md
index 8676ae4..b33937a 100644
--- a/docs/getting-started/walkthroughs/datastream_api.md
+++ b/docs/getting-started/walkthroughs/datastream_api.md
@@ -54,7 +54,7 @@ In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/commu
 
 If you want to follow along, you will require a computer with:
 
-* Java 8 
+* Java 8 or 11
 * Maven 
 
 A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly, so you only need to focus on filling out the 
business logic.
diff --git a/docs/getting-started/walkthroughs/datastream_api.zh.md b/docs/getting-started/walkthroughs/datastream_api.zh.md
index a2660a9..5819768 100644
--- a/docs/getting-started/walkthroughs/datastream_api.zh.md
+++ b/docs/getting-started/walkthroughs/datastream_api.zh.md
@@ -54,7 +54,7 @@ Flink 支持对状态和时间的细粒度控制,以此来实现复杂的事
 
 首先,你需要在你的电脑上准备以下环境:
 
-* Java 8
+* Java 8 or 11
 * Maven
 
 一个准备好的 Flink Maven Archetype 能够快速创建一个包含了必要依赖的 Flink 
程序骨架,基于此,你可以把精力集中在编写业务逻辑上即可。
diff --git a/docs/getting-started/walkthroughs/table_api.md b/docs/getting-started/walkthroughs/table_api.md
index c8dbca6..f4a507c 100644
--- a/docs/getting-started/walkthroughs/table_api.md
+++ b/docs/getting-started/walkthroughs/table_api.md
@@ -49,7 +49,7 @@ In particular, Apache Flink's [user mailing 
list](https://flink.apache.org/commu
 
 If you want to follow along, you will require a computer with: 
 
-* Java 8 
+* Java 8 or 11
 * Maven 
 
 A provided Flink Maven Archetype will create a skeleton project with all the 
necessary dependencies quickly:
diff --git a/docs/getting-started/walkthroughs/table_api.zh.md b/docs/getting-started/walkthroughs/table_api.zh.md
index b782226..c3995ae 100644
--- a/docs/getting-started/walkthroughs/table_api.zh.md
+++ b/docs/getting-started/walkthroughs/table_api.zh.md
@@ -48,7 +48,7 @@ Flink 中的 Table API 通常用于简化数据分析,数据流水线和 ETL 
 ## 如何跟进
 
 如果想要继续,你的电脑需要安装:
-* Java 8 
+* Java 8 or 11
 * Maven 
 
 现成的 Flink Maven Archetype 可以快速创建一个具有所有必要依赖的框架项目:



[flink] branch master updated (eea3d6f -> 2ae305c)

2020-02-04 Thread chesnay
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from eea3d6f  [FLINK-15864][k8s] Upgrade jackson-databind dependency to 2.10.1
 add 2ae305c  [FLINK-15807][docs] Add Java 11 to supported JDKs

No new revisions were added by this update.

Summary of changes:
 docs/getting-started/walkthroughs/datastream_api.md| 2 +-
 docs/getting-started/walkthroughs/datastream_api.zh.md | 2 +-
 docs/getting-started/walkthroughs/table_api.md | 2 +-
 docs/getting-started/walkthroughs/table_api.zh.md  | 2 +-
 docs/ops/deployment/local.md   | 2 +-
 docs/ops/deployment/local.zh.md| 2 +-
 6 files changed, 6 insertions(+), 6 deletions(-)



[flink] branch release-1.10 updated: [FLINK-15864][k8s] Upgrade jackson-databind dependency to 2.10.1

2020-02-04 Thread trohrmann
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 5fa6289  [FLINK-15864][k8s] Upgrade jackson-databind dependency to 2.10.1
5fa6289 is described below

commit 5fa62896fde2683aff1bd89573260ccf72c54d0d
Author: Till Rohrmann 
AuthorDate: Mon Feb 3 15:02:13 2020 +0100

[FLINK-15864][k8s] Upgrade jackson-databind dependency to 2.10.1

This closes #11000.
---
 flink-kubernetes/pom.xml   | 52 +-
 .../src/main/resources/META-INF/NOTICE |  2 +-
 2 files changed, 12 insertions(+), 42 deletions(-)

diff --git a/flink-kubernetes/pom.xml b/flink-kubernetes/pom.xml
index dfc4cb5..8fabddf 100644
--- a/flink-kubernetes/pom.xml
+++ b/flink-kubernetes/pom.xml
@@ -35,6 +35,17 @@ under the License.
4.5.2

 
+   
+   
+   
+   
+   com.squareup.okhttp3
+   okhttp
+   3.12.1
+   
+   
+   
+

 

@@ -57,54 +68,13 @@ under the License.
io.fabric8
kubernetes-client
${kubernetes.client.version}
-   
-   
-   
-   com.squareup.okhttp3
-   okhttp
-   
-   
-   
com.fasterxml.jackson.core
-   jackson-core
-   
-   
-   
com.fasterxml.jackson.core
-   
jackson-databind
-   
-   
-   
-
-   
-   com.fasterxml.jackson.core
-   jackson-databind
-   2.9.8
-   
-   
-   com.squareup.okhttp3
-   okhttp
-   3.12.1

 

-

io.fabric8
kubernetes-server-mock
${kubernetes.client.version}
-   
-   
-   com.squareup.okhttp3
-   okhttp
-   
-   
-   
com.fasterxml.jackson.core
-   jackson-core
-   
-   
-   
com.fasterxml.jackson.core
-   
jackson-databind
-   
-   
test

 
diff --git a/flink-kubernetes/src/main/resources/META-INF/NOTICE b/flink-kubernetes/src/main/resources/META-INF/NOTICE
index 47837c2..9502c67 100644
--- a/flink-kubernetes/src/main/resources/META-INF/NOTICE
+++ b/flink-kubernetes/src/main/resources/META-INF/NOTICE
@@ -8,7 +8,7 @@ This project bundles the following dependencies under the 
Apache Software Licens
 
 - com.fasterxml.jackson.core:jackson-annotations:2.10.1
 - com.fasterxml.jackson.core:jackson-core:2.10.1
-- com.fasterxml.jackson.core:jackson-databind:2.9.8
+- com.fasterxml.jackson.core:jackson-databind:2.10.1
 - com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.9.9
 - com.github.mifmif:generex:1.0.2
 - com.squareup.okhttp3:logging-interceptor:3.12.0



[flink] branch master updated (de7440e -> 3d51e6c3d)

2020-02-04 Thread srichter
This is an automated email from the ASF dual-hosted git repository.

srichter pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from de7440e  [FLINK-15658][table-planner-blink] Fix duplicate field names exception when join on the same key multiple times (#11011)
 add 3d51e6c3d [hotfix] [javadoc] RocksFullSnapshotStrategy remark DESCRIPTION correctly

No new revisions were added by this update.

Summary of changes:
 .../contrib/streaming/state/snapshot/RocksFullSnapshotStrategy.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink-docker] branch master updated: Minor updates to account for repo change

2020-02-04 Thread uce
This is an automated email from the ASF dual-hosted git repository.

uce pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git


The following commit(s) were added to refs/heads/master by this push:
 new 775fa1f  Minor updates to account for repo change
775fa1f is described below

commit 775fa1fa05cce5d7ad926d2a5240b3a71df407e9
Author: Patrick Lucas 
AuthorDate: Tue Feb 4 13:33:57 2020 +0100

Minor updates to account for repo change
---
 README.md | 4 ++--
 generate-stackbrew-library.sh | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index f3b4937..156843c 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@
-docker-flink
+flink-docker
 
 
-[![Build Status](https://travis-ci.org/docker-flink/docker-flink.svg?branch=master)](https://travis-ci.org/docker-flink/docker-flink)
+[![Build Status](https://travis-ci.org/apache/flink-docker.svg?branch=master)](https://travis-ci.org/apache/flink-docker)
 
 Docker packaging for Apache Flink
 
diff --git a/generate-stackbrew-library.sh b/generate-stackbrew-library.sh
index 906466d..de5bad1 100755
--- a/generate-stackbrew-library.sh
+++ b/generate-stackbrew-library.sh
@@ -58,11 +58,11 @@ getArches() {
 getArches 'flink'
 
 cat <<-EOH
-# this file is generated via https://github.com/docker-flink/docker-flink/blob/$(fileCommit "$self")/$self
+# this file is generated via https://github.com/apache/flink-docker/blob/$(fileCommit "$self")/$self
 
 Maintainers: Patrick Lucas  (@patricklucas),
  Ismaël Mejía  (@iemejia)
-GitRepo: https://github.com/docker-flink/docker-flink.git
+GitRepo: https://github.com/apache/flink-docker.git
 EOH
 
 # prints "$2$1$3$1...$N"



[flink-docker] branch master updated: Trigger CI

2020-02-04 Thread uce
This is an automated email from the ASF dual-hosted git repository.

uce pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-docker.git


The following commit(s) were added to refs/heads/master by this push:
 new 042a2e2  Trigger CI
042a2e2 is described below

commit 042a2e27a9b468ccc5b184f8383a8e7d80f859ba
Author: Ufuk Celebi 
AuthorDate: Tue Feb 4 13:28:30 2020 +0100

Trigger CI

An empty commit to check whether the Travis integration works as expected.



[flink] branch release-1.10 updated: [FLINK-15658][table-planner-blink] Fix duplicate field names exception when join on the same key multiple times (#11011)

2020-02-04 Thread jark
This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new c84b754  [FLINK-15658][table-planner-blink] Fix duplicate field names exception when join on the same key multiple times (#11011)
c84b754 is described below

commit c84b754b60e62f106adda47e91bfeec5ae5edeb5
Author: Jark Wu 
AuthorDate: Tue Feb 4 16:39:01 2020 +0800

[FLINK-15658][table-planner-blink] Fix duplicate field names exception when join on the same key multiple times (#11011)
---
 .../flink/table/planner/plan/utils/KeySelectorUtil.java  | 12 ++--
 .../flink/table/planner/runtime/stream/sql/JoinITCase.scala  | 12 
 2 files changed, 18 insertions(+), 6 deletions(-)

diff --git a/flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/plan/utils/KeySelectorUtil.java b/flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/plan/utils/KeySelectorUtil.java
index 5911abe..76934bf 100644
--- a/flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/plan/utils/KeySelectorUtil.java
+++ b/flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/plan/utils/KeySelectorUtil.java
@@ -44,20 +44,20 @@ public class KeySelectorUtil {
public static BaseRowKeySelector getBaseRowSelector(int[] keyFields, 
BaseRowTypeInfo rowType) {
if (keyFields.length > 0) {
LogicalType[] inputFieldTypes = 
rowType.getLogicalTypes();
-   String[] inputFieldNames = rowType.getFieldNames();
LogicalType[] keyFieldTypes = new 
LogicalType[keyFields.length];
-   String[] keyFieldNames = new String[keyFields.length];
for (int i = 0; i < keyFields.length; ++i) {
keyFieldTypes[i] = 
inputFieldTypes[keyFields[i]];
-   keyFieldNames[i] = 
inputFieldNames[keyFields[i]];
}
-   RowType returnType = RowType.of(keyFieldTypes, 
keyFieldNames);
-   RowType inputType = RowType.of(inputFieldTypes, 
rowType.getFieldNames());
+   // do not provide field names for the result key type,
+   // because we may have duplicate key fields and the 
field names may conflict
+   RowType returnType = RowType.of(keyFieldTypes);
+   RowType inputType = rowType.toRowType();
GeneratedProjection generatedProjection = 
ProjectionCodeGenerator.generateProjection(
CodeGeneratorContext.apply(new TableConfig()),
"KeyProjection",
inputType,
-   returnType, keyFields);
+   returnType,
+   keyFields);
BaseRowTypeInfo keyRowType = 
BaseRowTypeInfo.of(returnType);
return new BinaryRowKeySelector(keyRowType, 
generatedProjection);
} else {
diff --git a/flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/runtime/stream/sql/JoinITCase.scala b/flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/runtime/stream/sql/JoinITCase.scala
index f2a3131..36e8a45 100644
--- a/flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/runtime/stream/sql/JoinITCase.scala
+++ b/flink-table/flink-table-planner-blink/src/test/scala/org/apache/flink/table/planner/runtime/stream/sql/JoinITCase.scala
@@ -232,6 +232,18 @@ class JoinITCase(state: StateBackendMode) extends 
StreamingWithStateTestBase(sta
   }
 
   @Test
+  def testInnerJoinWithDuplicateKey(): Unit = {
+val query = "SELECT a1, b1, b3 FROM A JOIN B ON a1 = b1 AND a1 = b3"
+
+val sink = new TestingRetractSink
+tEnv.sqlQuery(query).toRetractStream[Row].addSink(sink).setParallelism(1)
+env.execute()
+
+val expected = Seq("2,2,2", "3,3,3")
+assertEquals(expected.sorted, sink.getRetractResults.sorted)
+  }
+
+  @Test
   def testInnerJoinWithNonEquiJoinPredicate(): Unit = {
 val sqlQuery = "SELECT c, g FROM Table3, Table5 WHERE b = e AND a < 6 AND 
h < b"
 

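Editor's note on the fix above: per the new comment in `KeySelectorUtil`, the key projection's result row type no longer carries field names, because a join such as `ON a1 = b1 AND a1 = b3` selects the same input field (`a1`) twice, and duplicate names in a row type conflict. A Python sketch of just that essence (names and structure are illustrative, not Flink's API):

```python
def key_row_type(input_names, key_fields, use_source_names):
    """Build the field names of a join-key projection. Naming key columns
    after their source fields collides when one field is used twice
    (e.g. ON a1 = b1 AND a1 = b3); positional names never do."""
    if use_source_names:
        names = [input_names[i] for i in key_fields]
        if len(set(names)) != len(names):
            # this is the pre-fix failure mode
            raise ValueError(f"duplicate field names: {names}")
    else:
        # the fix: synthetic positional names, one per key position
        names = [f"f{i}" for i in range(len(key_fields))]
    return names
```

With `key_fields = [0, 0]` the source-name path fails while the positional path yields `["f0", "f1"]`, mirroring why the commit drops names from the result key type.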


[flink] branch release-1.9 updated (0bd64a5 -> fe15e28)

2020-02-04 Thread hequn
This is an automated email from the ASF dual-hosted git repository.

hequn pushed a change to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 0bd64a5  [hotfix][doc] Fix version settings when building PyFlink
 add fe15e28  [FLINK-15638][release][python] Change version of pyflink to the release version when creating release branch

No new revisions were added by this update.

Summary of changes:
 tools/releasing/create_release_branch.sh | 5 +
 1 file changed, 5 insertions(+)
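Editor's note: the five lines added to `create_release_branch.sh` rewrite the PyFlink version when a release branch is cut. The script's actual mechanism is not shown in this mail; as an assumption-laden sketch, pinning a `__version__` string to the release version might look like:

```python
import re

def set_pyflink_version(source_text, release_version):
    """Rewrite a `__version__ = "..."` assignment to the release version.
    The file layout and variable name are assumptions for illustration;
    the real script edits files not quoted in this mail."""
    return re.sub(
        r'__version__\s*=\s*"[^"]*"',
        f'__version__ = "{release_version}"',
        source_text,
    )
```

The real shell script likely uses `sed` to the same effect; the point is that the dev version (e.g. `1.9.dev0`) becomes the release version (e.g. `1.9.2`) on the release branch.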



[flink] branch master updated (5281212 -> de7440e)

2020-02-04 Thread jark
This is an automated email from the ASF dual-hosted git repository.

jark pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 5281212  [FLINK-15494][table-planner-blink] Fix incorrect time field 
index in LogicalWindowAggregateRuleBase
 add de7440e  [FLINK-15658][table-planner-blink] Fix duplicate field names exception when join on the same key multiple times (#11011)

No new revisions were added by this update.

Summary of changes:
 .../flink/table/planner/plan/utils/KeySelectorUtil.java  | 12 ++--
 .../flink/table/planner/runtime/stream/sql/JoinITCase.scala  | 12 
 2 files changed, 18 insertions(+), 6 deletions(-)



[flink] branch release-1.9 updated (b8221b0 -> 0bd64a5)

2020-02-04 Thread hequn
This is an automated email from the ASF dual-hosted git repository.

hequn pushed a change to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git.


from b8221b0  [FLINK-15010][network] Add shut down hook to ensure cleanup netty shuffle directories
 add 0bd64a5  [hotfix][doc] Fix version settings when building PyFlink

No new revisions were added by this update.

Summary of changes:
 docs/flinkDev/building.md| 6 +-
 docs/flinkDev/building.zh.md | 6 +-
 2 files changed, 10 insertions(+), 2 deletions(-)



[flink] branch release-1.10 updated: [FLINK-15494][table-planner-blink] Fix incorrect time field index in LogicalWindowAggregateRuleBase

2020-02-04 Thread jark
This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 2f6d138  [FLINK-15494][table-planner-blink] Fix incorrect time field index in LogicalWindowAggregateRuleBase
2f6d138 is described below

commit 2f6d13818baf6e783eb2028fda131b89ea823cc2
Author: Benchao Li 
AuthorDate: Tue Feb 4 13:29:52 2020 +0800

[FLINK-15494][table-planner-blink] Fix incorrect time field index in LogicalWindowAggregateRuleBase

This closes #10784
---
 .../logical/BatchLogicalWindowAggregateRule.scala  | 32 ++
 .../logical/LogicalWindowAggregateRuleBase.scala   |  6 ++--
 .../logical/StreamLogicalWindowAggregateRule.scala | 32 +-
 .../batch/sql/agg/WindowAggregateITCase.scala  | 23 
 .../runtime/stream/sql/WindowAggregateITCase.scala | 29 
 5 files changed, 77 insertions(+), 45 deletions(-)

diff --git a/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/BatchLogicalWindowAggregateRule.scala b/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/BatchLogicalWindowAggregateRule.scala
index 35dcf31..50099b0 100644
--- a/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/BatchLogicalWindowAggregateRule.scala
+++ b/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/BatchLogicalWindowAggregateRule.scala
@@ -27,7 +27,6 @@ import 
org.apache.flink.table.runtime.types.LogicalTypeDataTypeConverter.fromLog
 import org.apache.calcite.rel.`type`.RelDataType
 import org.apache.calcite.rel.logical.{LogicalAggregate, LogicalProject}
 import org.apache.calcite.rex._
-import org.apache.calcite.sql.SqlKind
 
 import _root_.java.math.{BigDecimal => JBigDecimal}
 
@@ -56,33 +55,20 @@ class BatchLogicalWindowAggregateRule
 
   private[table] override def getTimeFieldReference(
   operand: RexNode,
-  windowExprIdx: Int,
+  timeAttributeIndex: Int,
   rowType: RelDataType): FieldReferenceExpression = {
 if (FlinkTypeFactory.isProctimeIndicatorType(operand.getType)) {
   throw new ValidationException("Window can not be defined over "
 + "a proctime attribute column for batch mode")
 }
-operand match {
-  case c: RexCall if c.getKind == SqlKind.CAST =>
-getTimeFieldReference(c.getOperands.get(0), windowExprIdx, rowType)
-  // match TUMBLE_ROWTIME and TUMBLE_PROCTIME
-  case c: RexCall if c.getOperands.size() == 1 &&
-FlinkTypeFactory.isTimeIndicatorType(c.getType) =>
-new FieldReferenceExpression(
-  rowType.getFieldList.get(windowExprIdx).getName,
-  fromLogicalTypeToDataType(toLogicalType(c.getType)),
-  0, // only one input, should always be 0
-  windowExprIdx)
-  case ref: RexInputRef =>
-// resolve field name of window attribute
-val fieldName = rowType.getFieldList.get(ref.getIndex).getName
-val fieldType = rowType.getFieldList.get(ref.getIndex).getType
-new FieldReferenceExpression(
-  fieldName,
-  fromLogicalTypeToDataType(toLogicalType(fieldType)),
-  0, // only one input, should always be 0
-  windowExprIdx)
-}
+
+val fieldName = rowType.getFieldList.get(timeAttributeIndex).getName
+val fieldType = rowType.getFieldList.get(timeAttributeIndex).getType
+new FieldReferenceExpression(
+  fieldName,
+  fromLogicalTypeToDataType(toLogicalType(fieldType)),
+  0,
+  timeAttributeIndex)
   }
 
   def getOperandAsLong(call: RexCall, idx: Int): Long =
diff --git a/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/LogicalWindowAggregateRuleBase.scala b/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/LogicalWindowAggregateRuleBase.scala
index f9260af..2a5d6a4 100644
--- a/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/LogicalWindowAggregateRuleBase.scala
+++ b/flink-table/flink-table-planner-blink/src/main/scala/org/apache/flink/table/planner/plan/rules/logical/LogicalWindowAggregateRuleBase.scala
@@ -40,10 +40,8 @@ import org.apache.calcite.rex._
 import org.apache.calcite.sql.`type`.SqlTypeUtil
 import org.apache.calcite.tools.RelBuilder
 import org.apache.calcite.util.ImmutableBitSet
-import org.apache.calcite.util.{Pair => CPair}
 
 import _root_.scala.collection.JavaConversions._
-import _root_.scala.collection.mutable.ArrayBuffer
 
 /**
   * Planner rule that transforms simple [[LogicalAggregate]] on a 
[[LogicalProject]]
@@ -75,7 +73,6 @@ abstract class LogicalWindowAggregateRuleBase