[flink] branch master updated (bc55577 -> 7f6ce78)

2020-09-09 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from bc55577  [FLINK-19094][docs] Revise the description of watermark 
strategy in Flink Table document
 add 7f6ce78  [FLINK-19163][python][build system] Add building py38 wheel 
package of PyFlink in Azure CI (#13362)

No new revisions were added by this update.

Summary of changes:
 flink-python/dev/build-wheels.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink] branch master updated: [FLINK-19118][python] Support Expression in the operations of Python Table API (#13304)

2020-09-03 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new a8cc62a  [FLINK-19118][python] Support Expression in the operations of 
Python Table API (#13304)
a8cc62a is described below

commit a8cc62a901dabe6c4d877b97db6024715b68174a
Author: Dian Fu 
AuthorDate: Fri Sep 4 09:15:58 2020 +0800

[FLINK-19118][python] Support Expression in the operations of Python Table 
API (#13304)
---
 .../pyflink/table/examples/batch/word_count.py |   7 +-
 flink-python/pyflink/table/table.py| 341 +
 flink-python/pyflink/table/table_environment.py|  74 ++---
 flink-python/pyflink/table/tests/test_aggregate.py |   2 +-
 flink-python/pyflink/table/tests/test_calc.py  |   9 +-
 .../pyflink/table/tests/test_column_operation.py   |   8 +-
 flink-python/pyflink/table/tests/test_correlate.py |   9 +-
 .../pyflink/table/tests/test_dependency.py |  27 +-
 flink-python/pyflink/table/tests/test_explain.py   |   3 +-
 flink-python/pyflink/table/tests/test_join.py  |  12 +-
 .../pyflink/table/tests/test_pandas_conversion.py  |   6 +-
 .../pyflink/table/tests/test_pandas_udf.py |   5 +-
 .../pyflink/table/tests/test_schema_operation.py   |   4 +-
 flink-python/pyflink/table/tests/test_sort.py  |   2 +-
 .../table/tests/test_table_environment_api.py  |   4 +-
 flink-python/pyflink/table/tests/test_window.py|  21 +-
 flink-python/pyflink/table/utils.py|  11 +-
 flink-python/pyflink/table/window.py   | 108 ---
 18 files changed, 380 insertions(+), 273 deletions(-)

diff --git a/flink-python/pyflink/table/examples/batch/word_count.py b/flink-python/pyflink/table/examples/batch/word_count.py
index 1a5bf2c..3f317af 100644
--- a/flink-python/pyflink/table/examples/batch/word_count.py
+++ b/flink-python/pyflink/table/examples/batch/word_count.py
@@ -23,6 +23,7 @@ import tempfile
 
 from pyflink.dataset import ExecutionEnvironment
 from pyflink.table import BatchTableEnvironment, TableConfig
+from pyflink.table import expressions as expr
 
 
 def word_count():
@@ -65,9 +66,9 @@ def word_count():
     t_env.execute_sql(sink_ddl)
 
     elements = [(word, 1) for word in content.split(" ")]
-    t_env.from_elements(elements, ["word", "count"]) \
-        .group_by("word") \
-        .select("word, count(1) as count") \
+    table = t_env.from_elements(elements, ["word", "count"])
+    table.group_by(table.word) \
+        .select(table.word, expr.lit(1).count.alias('count')) \
         .insert_into("Results")
 
     t_env.execute("word_count")
diff --git a/flink-python/pyflink/table/table.py b/flink-python/pyflink/table/table.py
index 2369aaf..10f3a3d 100644
--- a/flink-python/pyflink/table/table.py
+++ b/flink-python/pyflink/table/table.py
@@ -19,13 +19,18 @@
 import warnings
 
 from py4j.java_gateway import get_method
+from typing import Union
 
 from pyflink.java_gateway import get_gateway
+from pyflink.table import ExplainDetail
+from pyflink.table.expression import Expression, _get_java_expression
+from pyflink.table.expressions import col
 from pyflink.table.serializers import ArrowSerializer
 from pyflink.table.table_result import TableResult
 from pyflink.table.table_schema import TableSchema
 from pyflink.table.types import create_arrow_schema
-from pyflink.table.utils import tz_convert_from_internal
+from pyflink.table.utils import tz_convert_from_internal, to_expression_jarray
+from pyflink.table.window import OverWindow, GroupWindow
 
 from pyflink.util.utils import to_jarray
 from pyflink.util.utils import to_j_explain_detail_arr
@@ -66,7 +71,25 @@ class Table(object):
         self._j_table = j_table
         self._t_env = t_env
 
-    def select(self, fields):
+    def __str__(self):
+        return self._j_table.toString()
+
+    def __getattr__(self, name) -> Expression:
+        """
+        Returns the :class:`Expression` of the column `name`.
+
+        Example:
+        ::
+
+            >>> tab.select(tab.a)
+        """
+        if name not in self.get_schema().get_field_names():
+            raise AttributeError(
+                "The current table has no column named '%s', available columns: [%s]"
+                % (name, ', '.join(self.get_schema().get_field_names())))
+        return col(name)
+
+    def select(self, *fields: Union[str, Expression]):
         """
         Performs a selection operation. Similar to a SQL SELECT statement. The field expressions
         can contain complex expressions.
@@ -74,29 +97,35 @@
         Example:
         ::
 
+            >>> from pyflink.table import expressions as expr


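The change above is easier to see end-to-end. Below is a minimal, self-contained sketch of the new Expression-based style; the environment setup is illustrative and not part of the commit, and it assumes a PyFlink version that ships pyflink.table.expressions:

from pyflink.dataset import ExecutionEnvironment
from pyflink.table import BatchTableEnvironment
from pyflink.table import expressions as expr

env = ExecutionEnvironment.get_execution_environment()
t_env = BatchTableEnvironment.create(env)

table = t_env.from_elements([("hello", 1), ("flink", 1)], ["word", "count"])

# Columns are now plain attributes: Table.__getattr__ resolves "word" to an
# Expression via col("word"), so operations accept Python objects instead of
# only expression strings.
result = table.group_by(table.word) \
              .select(table.word, expr.lit(1).count.alias('count'))

The old string form (e.g. .select("word, count(1) as count")) keeps working, since select() now takes Union[str, Expression].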
svn commit: r41116 - /release/flink/flink-1.10.1/

2020-08-24 Thread jincheng
Author: jincheng
Date: Tue Aug 25 05:14:30 2020
New Revision: 41116

Log:
Remove flink-1.10.1

Removed:
release/flink/flink-1.10.1/



[flink] branch release-1.11 updated: [FLINK-18816] [docs] Correct API change in pyflink dependency management page (#13062)

2020-08-04 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.11
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.11 by this push:
 new 043d93f  [FLINK-18816] [docs] Correct API change in pyflink dependency 
management page (#13062)
043d93f is described below

commit 043d93f9e2e8d5a86ac4ed9bd7c8c5c19f69d05c
Author: Zhenhua Yang 
AuthorDate: Wed Aug 5 10:43:08 2020 +0800

[FLINK-18816] [docs] Correct API change in pyflink dependency management 
page (#13062)
---
 docs/dev/table/python/dependency_management.md| 4 ++--
 docs/dev/table/python/dependency_management.zh.md | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/dev/table/python/dependency_management.md b/docs/dev/table/python/dependency_management.md
index 0a68633..3dd6846 100644
--- a/docs/dev/table/python/dependency_management.md
+++ b/docs/dev/table/python/dependency_management.md
@@ -29,11 +29,11 @@ If third-party Java dependencies are used, you can specify the dependencies with
 {% highlight python %}
 # Specify a list of jar URLs via "pipeline.jars". The jars are separated by ";" and will be uploaded to the cluster.
 # NOTE: Only local file URLs (start with "file://") are supported.
-table_env.get_config().set_configuration("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
 
 # Specify a list of URLs via "pipeline.classpaths". The URLs are separated by ";" and will be added to the classpath of the cluster.
 # NOTE: The Paths must specify a protocol (e.g. file://) and users should ensure that the URLs are accessible on both the client and the cluster.
-table_env.get_config().set_configuration("pipeline.classpaths", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
+table_env.get_config().get_configuration().set_string("pipeline.classpaths", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
 {% endhighlight %}
 
 # Python Dependency
diff --git a/docs/dev/table/python/dependency_management.zh.md b/docs/dev/table/python/dependency_management.zh.md
index edf39d5..82db82c 100644
--- a/docs/dev/table/python/dependency_management.zh.md
+++ b/docs/dev/table/python/dependency_management.zh.md
@@ -29,11 +29,11 @@ If third-party Java dependencies are used, you can specify the dependencies with
 {% highlight python %}
 # Specify a list of jar URLs via "pipeline.jars". The jars are separated by ";" and will be uploaded to the cluster.
 # NOTE: Only local file URLs (start with "file://") are supported.
-table_env.get_config().set_configuration("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
 
 # Specify a list of URLs via "pipeline.classpaths". The URLs are separated by ";" and will be added to the classpath of the cluster.
 # NOTE: The Paths must specify a protocol (e.g. file://) and users should ensure that the URLs are accessible on both the client and the cluster.
-table_env.get_config().set_configuration("pipeline.classpaths", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
+table_env.get_config().get_configuration().set_string("pipeline.classpaths", "file:///my/jar/path/connector.jar;file:///my/jar/path/udf.jar")
 {% endhighlight %}
 
 # Python Dependency
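Standalone, the corrected configuration pattern looks like this (a hedged sketch: the jar paths are placeholders and table_env is any existing TableEnvironment):

# get_config() returns a TableConfig; its get_configuration() exposes the
# underlying Configuration object, whose set_string() is the supported setter --
# hence the doc change away from set_configuration(...).
jars = ";".join([
    "file:///my/jar/path/connector.jar",
    "file:///my/jar/path/udf.jar",
])
table_env.get_config().get_configuration().set_string("pipeline.jars", jars)
table_env.get_config().get_configuration().set_string("pipeline.classpaths", jars)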



[flink] branch master updated (cfda0e0 -> 456d5ba)

2020-08-04 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from cfda0e0  [FLINK-18690][runtime] Implement 
LocalInputPreferredSlotSharingStrategy
 add 456d5ba  [FLINK-18816] [docs] Correct API change in pyflink dependency 
management page (#13062)

No new revisions were added by this update.

Summary of changes:
 docs/dev/table/python/dependency_management.md| 4 ++--
 docs/dev/table/python/dependency_management.zh.md | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)



svn commit: r39400 - /release/flink/flink-1.10.0/

2020-05-12 Thread jincheng
Author: jincheng
Date: Tue May 12 11:43:07 2020
New Revision: 39400

Log:
Remove old release flink 1.10.0

Removed:
release/flink/flink-1.10.0/



svn commit: r39396 - in /dev/flink: flink-1.10.1-rc1/ flink-1.10.1-rc2/

2020-05-12 Thread jincheng
Author: jincheng
Date: Tue May 12 10:05:28 2020
New Revision: 39396

Log:
Remove old release candidates for Apache Flink 1.10.1

Removed:
dev/flink/flink-1.10.1-rc1/
dev/flink/flink-1.10.1-rc2/



svn commit: r39395 - /dev/flink/flink-1.10.1-rc3/ /release/flink/flink-1.10.1/

2020-05-12 Thread jincheng
Author: jincheng
Date: Tue May 12 10:02:02 2020
New Revision: 39395

Log:
Release Flink 1.10.1

Added:
release/flink/flink-1.10.1/
  - copied from r39394, dev/flink/flink-1.10.1-rc3/
Removed:
dev/flink/flink-1.10.1-rc3/



[flink] branch master updated: [FLINK-17454][python] Specify a port number for gateway callback server from python gateway.

2020-05-11 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 5f744d3  [FLINK-17454][python] Specify a port number for gateway 
callback server from python gateway.
5f744d3 is described below

commit 5f744d3f81bcfb8f77164a5ec9caa4594851d4bf
Author: acqua.csq 
AuthorDate: Fri May 8 23:29:12 2020 +0800

[FLINK-17454][python] Specify a port number for gateway callback server 
from python gateway.

This closes #12061
---
 flink-python/pyflink/java_gateway.py   | 11 +++--
 .../apache/flink/client/python/PythonEnvUtils.java | 55 +-
 2 files changed, 49 insertions(+), 17 deletions(-)

diff --git a/flink-python/pyflink/java_gateway.py b/flink-python/pyflink/java_gateway.py
index d8e061b..33ab2ae 100644
--- a/flink-python/pyflink/java_gateway.py
+++ b/flink-python/pyflink/java_gateway.py
@@ -49,15 +49,19 @@ def get_gateway():
             # if Java Gateway is already running
             if 'PYFLINK_GATEWAY_PORT' in os.environ:
                 gateway_port = int(os.environ['PYFLINK_GATEWAY_PORT'])
-                callback_port = int(os.environ['PYFLINK_CALLBACK_PORT'])
                 gateway_param = GatewayParameters(port=gateway_port, auto_convert=True)
                 _gateway = JavaGateway(
                     gateway_parameters=gateway_param,
                     callback_server_parameters=CallbackServerParameters(
-                        port=callback_port, daemonize=True, daemonize_connections=True))
+                        port=0, daemonize=True, daemonize_connections=True))
             else:
                 _gateway = launch_gateway()
 
+            callback_server = _gateway.get_callback_server()
+            callback_server_listening_address = callback_server.get_listening_address()
+            callback_server_listening_port = callback_server.get_listening_port()
+            _gateway.jvm.org.apache.flink.client.python.PythonEnvUtils.resetCallbackClient(
+                callback_server_listening_address, callback_server_listening_port)
             # import the flink view
             import_flink_view(_gateway)
             install_exception_handler()
@@ -102,7 +106,6 @@ def launch_gateway():
 
         with open(conn_info_file, "rb") as info:
             gateway_port = struct.unpack("!I", info.read(4))[0]
-            callback_port = struct.unpack("!I", info.read(4))[0]
     finally:
         shutil.rmtree(conn_info_dir)
 
@@ -110,7 +113,7 @@
     gateway = JavaGateway(
         gateway_parameters=GatewayParameters(port=gateway_port, auto_convert=True),
         callback_server_parameters=CallbackServerParameters(
-            port=callback_port, daemonize=True, daemonize_connections=True))
+            port=0, daemonize=True, daemonize_connections=True))
 
     return gateway
 
diff --git a/flink-python/src/main/java/org/apache/flink/client/python/PythonEnvUtils.java b/flink-python/src/main/java/org/apache/flink/client/python/PythonEnvUtils.java
index cf15b7b..76370dc 100644
--- a/flink-python/src/main/java/org/apache/flink/client/python/PythonEnvUtils.java
+++ b/flink-python/src/main/java/org/apache/flink/client/python/PythonEnvUtils.java
@@ -35,7 +35,10 @@ import py4j.GatewayServer;
 import java.io.File;
 import java.io.IOException;
 import java.lang.reflect.Field;
+import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
+import java.net.InetAddress;
+import java.net.UnknownHostException;
 import java.nio.file.FileSystems;
 import java.nio.file.FileVisitResult;
 import java.nio.file.Files;
@@ -277,16 +280,7 @@
             .gateway(new Gateway(new ConcurrentHashMap(), new CallbackClient(freePort)))
             .javaPort(0)
             .build();
-        CallbackClient callbackClient = (CallbackClient) server.getCallbackClient();
-        // The Java API of py4j does not provide approach to set "daemonize_connections" parameter.
-        // Use reflect to daemonize the connection thread.
-        Field executor = CallbackClient.class.getDeclaredField("executor");
-        executor.setAccessible(true);
-        ((ScheduledExecutorService) executor.get(callbackClient)).shutdown();
-        executor.set(callbackClient, Executors.newScheduledThreadPool(1, Thread::new));
-        Method setupCleaner = CallbackClient.class.getDeclaredMethod("setupCleaner");
-        setupCleaner.setAccessible(true);
-

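The py4j pattern at the heart of this commit, as a small self-contained sketch (gateway_port is a placeholder; a JVM-side py4j GatewayServer must already be listening there):

from py4j.java_gateway import CallbackServerParameters, GatewayParameters, JavaGateway

gateway_port = 25333  # placeholder; PyFlink reads the real port from a connection-info file

# port=0 lets the OS pick any free callback port, so none has to be agreed on
# in advance...
gateway = JavaGateway(
    gateway_parameters=GatewayParameters(port=gateway_port, auto_convert=True),
    callback_server_parameters=CallbackServerParameters(
        port=0, daemonize=True, daemonize_connections=True))

# ...and the address/port actually chosen are read back afterwards; PyFlink
# forwards them to the JVM via PythonEnvUtils.resetCallbackClient (see the
# Java diff above).
callback_server = gateway.get_callback_server()
print(callback_server.get_listening_address(), callback_server.get_listening_port())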
svn commit: r39003 - /release/flink/KEYS

2020-04-17 Thread jincheng
Author: jincheng
Date: Fri Apr 17 06:24:36 2020
New Revision: 39003

Log:
add key for dian

Modified:
release/flink/KEYS

Modified: release/flink/KEYS
==
--- release/flink/KEYS (original)
+++ release/flink/KEYS Fri Apr 17 06:24:36 2020
@@ -1823,3 +1823,64 @@ XsJeobCqzKNL8uowvoOcsfalfEn6PSCSO7tyUlUb
 0CWBS+8V
 =fg3B
 -----END PGP PUBLIC KEY BLOCK-----
+
+pub   rsa4096 2020-04-17 [SC]
+      6B6291A8502BA8F0913AE04DDEB95B05BF075300
+uid           [ultimate] Dian Fu (CODE SIGNING KEY)
+sig 3        DEB95B05BF075300 2020-04-17  Dian Fu (CODE SIGNING KEY)
+sub   rsa4096 2020-04-17 [E]
+sig          DEB95B05BF075300 2020-04-17  Dian Fu (CODE SIGNING KEY)
+
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBF6ZQBoBEADAK64BtsMWxhrOxH6I5Caq9Nbq7SvTFJKszXLYrLC0Jt+ZkVHu
+69vvDLVFCIKT3hkM5lEgObYKwr/dniIWEevrGHUcGh3lOESZAHkvwpX06jpCNHoj
+qGYGlbZLSOg5c2HtSdAUoFuGFaw87KsiuLmhr+M+3VmRGN+aBytYHpUU+3hMtvi5
+besZ1XcFWHiVmZPM1TLuhVRV2UxM3pMcrowaP82b/ozSTpsiej12GKWJ33V6bkD6
+b6scFMaz6dwZfQEsbcXD0neRjovuVnUxdM4fyOgTw2Oql4Y/lFm+r4aHrE/cTe6V
+/JqWw9/gT1UaXVf9PnB1bysLPPdE5jaKddUM7IlsY9YMJntC+/IO23iGpeRwlikm
+FGj0Km66CYqR62bpynuipvSQhH8PYG3oCtvadeGpb0BUa/jbX2/TP5S22G70il8W
+FHP8YqISRYBxN7c39d0CbaBLK7hOYqkVRXjb8+WO5ad7D870XBh1cb5PNCG+9ojB
+NIXQ5wX0d9DOCwIblAVu5z4VNeTevUUf1RjHIjvSXvhSZaSnwndslFgKupzwc5lA
+Uxl58FQrUMOaWSDz2xYUUWu+rPgTVuEDTcuf2rYK+YCB2mYrCn2jonbFuqFobAU2
+lqdBbUGoatd5OBckS6UgfkHfmOj01v/DX2K46W/gMmEneoXjeMwpp5J1ewARAQAB
+tC5EaWFuIEZ1IChDT0RFIFNJR05JTkcgS0VZKSA8ZGlhbmZ1QGFwYWNoZS5vcmc+
+iQJOBBMBCAA4FiEEa2KRqFArqPCROuBN3rlbBb8HUwAFAl6ZQBoCGwMFCwkIBwIG
+FQoJCAsCBBYCAwECHgECF4AACgkQ3rlbBb8HUwB4nA/+Nmvfo70Aq6zl+lK9+6uM
+lvr73HWIslxT1euLxH+TEjz6qqIjQ4N8uuaHZk2qv9S19gGxqpwTLqomRcOY2VJr
+ihKMvtjvVKPNAUp7+ppOBQA2rp0gYXZocyxgjAff28y0kacQCf0paLcaSH+LWn+o
+3SdDzcc6I8kQdrrNOEwCgl3v83XBz14Qu0wC3cEvf6LWrNVIEWtIcVPlEDHuuokz
+kV2P088IsASE4iusF9Dlr08xE/gClt6J+Iz1glmtdcZqv6xMN3M/dmMqlF+hbdPB
+4rJdQZw6sR9YFlF5RZOeVVg6Q9lTNViaktaTHRBkuuSInVarXIZYFPEa3uD39TWj
+MmJzI4mZrod96ez01kqhJ+TL55Y6Of8AmdFH8FRE3F1EuwkoY4Kn9gGm2GiiqXic
+FPdd250UIKqQbSvA1bKVfy9F/iRx+1BzCnLRl7w2oKH0kFRvOMIYbr911PqxDDwr
+CDBBqbWppMQRZurT/TKnk4njA/NKUD874hxEmuku2Qo4qS3F316WRElR02g5Ykh5
+189fDaRujhzTdo+rQyGZi5oNdH5zZyJpm7AiD4QR+jFb+cNqEJPi+QhA8QA0iw8/
+kLd7ndie8X2LW+HmmZ20qh8YlMiS7TVjj5FzRMqxCbvBmQxFGnkhDGNL1Bv8J86O
+j/JRdkA0e0wa0MDRqlrCNOm5Ag0EXplAGgEQAKuy5arOsgpifz1oNAeVoz75+Ui+
+NNDtlE684nYwZkzb5+v1rE6tyVJDTrXzRiyqyX5IOM0buOy7tti5CeXKq87CG5Dl
+3q20F/aiauqwGflqlZlNT+4PZBWhEP0z5/INJxWG0JVHiiO837P+C6f4dv18yhFt
+l+OAVGMYikLFUy4MDwAj/yEn1QX6UbPY/1jH241pTCA/DJxwsCszXlJbOI6HUV0U
+ndfLu7aP6+r2SHmJJIAbln2NDKF6s0w0kJANGgFPjqZRg33b4fuqQoWbvZmp+K0D
+3UYAwW7K0CsvPFGOHlWYrSCoe++XmTjEpqd4tVGlQ5rMYLs5tupNbrsTBia0OGpf
+M+JsTUonsKZIKocK1Tbp9Kc+4eWn+ol8VVVjb3kRhAnH1uXW6OPJeU16ppHvOqGJ
+Km7IJbVdL761Lp6Xw2uZjt3A8QRr4xfHeir0IsLrUgkGxb6WdfUpVbJ7HW7EExdF
+ylaERQlxF+QAx64bbRl4e0rtYt6Lz0ZDfk/1IAGuJ1zZ6T/o6vL149m6RxtXTD1A
+T7gZ2xa7BfPP/gmE9U2YfDyCFXgGLgQJdhIIhS6Mfv+f28oSqFBlG23TwByGYtG4
+Y1yE92VmQl4EY65sYLNkgRSWdOyepI4bva3vvPUY+Qlf8w7xYelGwST/Un83MYYx
+zrj5AkLVUL4hmgunABEBAAGJAjYEGAEIACAWIQRrYpGoUCuo8JE64E3euVsFvwdT
+AAUCXplAGgIbDAAKCRDeuVsFvwdTACzgD/oDPTvfywe+4bFbo+Pv1CQvmNz4t3RT
+3KELhKmdXBMQ0mAwmOr/KZ4JYrb+2Tr4oHusGHqZymQYetYTLW5KgUOosMc927C7
+xLVax0jsxXWtSkARvZ3sXxb+Q2EBinxi4pB/TGq6MR95Qnk8H1GR/bGbVW90Cn5S
+hZqgEvDD2jsd/DXRhbXHNszjX6AT3lJGGq7sX8jBmmxdTBAq49vak7ap0umgJuWT
++WAZzB9Z142VmC43ejxwCwJaKfAquHjLt08df/YDKodNest9zCXgyYMMcZld4lqE
+DabGzMbGM3VcSjUmGbIT0gQJTsy6fpyYedlrThdI5CcqspHMv/2xIP6hjGv4SD2l
+usJ37MBr0uNK6kkd/AMV0HK/k1Q/oEZ/MVTa77g0Ydd6c46nWnQs7abtAaGwi8Oi
+cRcNm+dALVwUGfhZdW5W60SdSjTfEhv+wjPnLLOzsuO96LOreApjMfcc7gWpYjEC
+dy5VC0DncbshuYY88nVs9GAHvhDyMwqPYDlRym6QBd2AEx8GCrp9+ErM3CCbReWp
+TKM9ayMOQ+uYvkVdvPJu9InPBp4ziTur6EjrrNfbsABAS45Ez7BLANateymvR7bH
+5PS4wtWzWo48D1Ev6SxgoYuIAsWj1or0hsyhlvxQ99+lbmZFJzA2XRb7WkjA3Z7k
+kVyZNDCORta08w==
+=2FP5
+-----END PGP PUBLIC KEY BLOCK-----
+
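How such a KEYS file is typically used when checking a release, as a hedged sketch (file names are placeholders; requires GnuPG on the PATH):

import subprocess

# Import all release managers' public keys, then verify a detached signature.
subprocess.run(["gpg", "--import", "KEYS"], check=True)

artifact = "apache-flink-1.9.2.tar.gz"  # placeholder artifact name
subprocess.run(["gpg", "--verify", artifact + ".asc", artifact], check=True)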




[flink] branch 1.10 created (now 8a27bd9)

2020-03-20 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch 1.10
in repository https://gitbox.apache.org/repos/asf/flink.git.


  at 8a27bd9  [FLINK-16691][python][docs] Improve Python UDF documentation 
to remind users to install PyFlink on the cluster

No new revisions were added by this update.



svn commit: r38502 - /release/flink/KEYS

2020-03-13 Thread jincheng
Author: jincheng
Date: Fri Mar 13 12:55:26 2020
New Revision: 38502

Log:
Add KEY for Yu Li

Modified:
release/flink/KEYS

Modified: release/flink/KEYS
==
--- release/flink/KEYS (original)
+++ release/flink/KEYS Fri Mar 13 12:55:26 2020
@@ -1764,3 +1764,62 @@ bdBQn5N9SpDTqE4moVe2Pc9aWPPj4en+EEjgWoSC
 DTA=
 =o0P4
 -----END PGP PUBLIC KEY BLOCK-----
+pub   rsa4096 2019-08-29 [SC]
+      D8D3D42E84C753CA5F170BDF93C07902771AB743
+uid           [ultimate] Yu Li (CODE SIGNING KEY)
+sig 3        93C07902771AB743 2019-08-29  Yu Li (CODE SIGNING KEY)
+sub   rsa4096 2019-08-29 [E]
+sig          93C07902771AB743 2019-08-29  Yu Li (CODE SIGNING KEY)
+
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBF1nH5kBEACvfpza6VlMFn/mwLOD2mnxJWBxEatau3KKTuTpj4ygByvt2/AP
+5mwXkFzBBBiVvz0y+CYnj8jDcXhgV3Xe+e8Dpc5svV25ezsoAQq6BPXhJayzTFnG
+AtypvRCv9IwaePBRxmaP68M/RtgYlgIxwFMzF07ntMMsz9MXiBQJfpL4Am685Ywi
+iX2aAGnS+ikMFTI19ENMv32d+0Arg0Klfrmb2KMUkjFVEidc2noamIcZ8QlFwzAh
+/RT/7juGNSJ68rG6+9Ya5KmCJYY1BqOq2LQC1H7Ba3oHCyMYHfFl1PUi+S+CtiIJ
+aO59Zht8+kQ7X9W4UydGy/XegtT4ivktjXmhysQTYfaj9RVmQvXXG/OGIdyW+jQb
++3/ZWr4cQsye/jW9LHBWtEq0t/+xb+3WkjBAPWSAPYNEWhJN7yqJayGAhyPTNink
+tBp6UfOKnPDVu08R0SYHqBP+ks50lyWHBJDqhRwoeYRY+fs5dn9DmMuqWkS21w+L
+NkHKhCW+UNhxhzL96ASqAYJoc5Vn8wVwp7oPNi/3aWUdvup/Lq/g2utUq+zipRG/
+RB9/Dcg8m26hEnLidIOJ6Y8uFCjtwtcsXR0597UxvKYrsMItfaLZy9BNsLdejs+N
+S2AiaxFDm/Dm93xgPVyZhxH1O5eg7ev3X6Nf4ckXoubWsbaEJpNadmvoqwARAQAB
+tCpZdSBMaSAoQ09ERSBTSUdOSU5HIEtFWSkgPGxpeXVAYXBhY2hlLm9yZz6JAk4E
+EwEKADgWIQTY09QuhMdTyl8XC9+TwHkCdxq3QwUCXWcfmQIbAwULCQgHAwUVCgkI
+CwUWAgMBAAIeAQIXgAAKCRCTwHkCdxq3Q6ekD/9N88rU2lsZyABLDNGWWJf+qzPH
+kneujxY8HYIsxF5paHdHJbKKcLxMch26grhijLQtqlE2CcHANgiui+ispFoSTF0t
+jiDvofai+PFjF+o9SEMypKq/GstwpTQ1t3i0G8gpBROu+5g7regFmlGqhbYQG+Sf
+0pHtTYkm5AxM83AMGQLpavZWXjPegj7ANPLf/8SnfxeDCsTJlpnqsWqnigKoWQDh
+TH8WH9Nv881GqknKlt1Yb5aqjOwFn5MlKxjX7LszbGdyUqRpYCmtc/mxYPWgWDc6
+1CTF5eVTNQfuBSA/xMR/8EekMmOcCBeMTaeEGPU+cmo2TKnKhDsudOR/PDg2kXoM
+sj+b7GF07lvragv7psN7MScPnN4DJrWa7M8Uo41CB6Zz3eml+ngOC80pHSP+lGWv
+NGlaHHYS1vA/hW306aDA5dgAMcs84M3z0jJ1KmmU1sn3QwF+ukaf1G6pukQi0rQt
+pSftD9IIySeM7naT16WwVC1fFOtB4iwWUauvMNCjpf0kSN+gJTyFBysLCtltmpxI
+krllW/EJkKQzsIwZJ3AkFdVPgDe6eJD7NJ/pteNV2MmiPFEl9q1qJ/2gtdNcmKCQ
+v8THjtyAWRNnSsXiP18lOqzz4ftnAcsSpebb6z1JkWrkTwDDokuzlqkR3i85Yvg8
+s/Xng1VwGTJSZkUDdrkCDQRdZx+ZARAAwUuZMEz6cg9PSLfaYW3rqkm5U5jAvChO
+91KwIHbIsrN5wsR5BfZVsCTCR3fZJVxlE0H5Vk9IrhMYPQcC9hPdZthDIGJ1bsJu
+oT42RR9mmuETYd/HRvULu083m7YDkaNRhHcjoQShYy4NVC/ijTn22LdRval6drmz
+NAqtiFKsAvv7W0UfkBLOn3mTeGRd5J1llvb8Rve468iAecj4WxN11/J8eNtLYodn
+fLugjKKDonGCrAeSdNeINuM3TKDsaQtHTT7uIl288P83Z7Ldm5oq2wZbwh3FY7Q8
+YZRYpaDQWRxOul8SM4DMKcg/CHk5mH7p0N4P+LWkCbW3HbblDnzPkTW8MA3JVzOt
++YhP3AP6ugvPZvDxjt0gYdqAFpqKTnOG8hk4SzOkQZ3wKbn1CAy0WDBKqN+a6ZHc
+IPDCOgb2vsywT3Vww+dgKr3SctZKkD7fFzqo8ISXyuhqgDrSoTjH7EXpiVFkgXkS
+b2ZkW+twDZIz67FdIh+Mdsn0U7frjSocYQEUOM8CRxBC+N7oU2qjgu23fC96jcUf
+XPzd6gzvlkkL43506YlJcIh9/CpMNxJwZOzO1yPHF3ovz8XAmkQ28yIdMFt9YP/m
+G0s+cc+N8Em/nLyJessdAhuAzm8SQRddtsAXKxWiaH+ZvXtp32kzQkdWUeFMY0AG
+KDNUIqVweg0AEQEAAYkCNgQYAQoAIBYhBNjT1C6Ex1PKXxcL35PAeQJ3GrdDBQJd
+Zx+ZAhsMAAoJEJPAeQJ3GrdDSB0QAJGdwyTkqsyypKrs6lJ7TQMMVzUEGmfYZbIQ
+D3oVLfs19+hWtX3b1Gu+TKCb/v3EJ46VAG/1vjUb3l4GQzlJkRSWoYkhAe8RRLXR
+YHRo0renr3MS0xRvtqhYloRwtnL0rQRREnrdlVQMR0kJeDDcIUC54AuPXyFTmi+C
+oTbyYjoPTnqK0QM89WeMdr0WAkXlBNr5Er5USiyJn7wi0qwvjn+a8gOZ35pMUmMH
+o3AenLTgPkKu12P+mHfYfPKqfn3adurZ4uesdWm2b7gDdsTz3XmM49QITXEVnk61
+wTDYHLAOiK0px3dlcXfnR6uO9qGeTfr9qHj92uKaDGjidUL50dkV53WnYIrkFpMK
+P9KzSTMyOTAs2Hk9iz9QefGIH8HpBgQVRPE06i5uk3lWuSoP9YhJulwp3mJG4o/H
+Vlr/Y7ypALd8bZ+kMtWFzQRNrQfp0MDn6QbPNvJ4y2Q90MCB6Q34KkbBY5O4Gxj1
+PFVcnH+6DTs/i8TVho4VUd72CMSzZ3sfc8XssMU9G2iyhKotWMDi6+4k69joUiPQ
+BizgLBrKDr+0OQDcdcuXme2s16fhx7Ly+WEVtzi502nEsb1vYcyj4HJnWupYkDlJ
+XsJeobCqzKNL8uowvoOcsfalfEn6PSCSO7tyUlUbA0s//IPt9/BhmLS/pXVgToSs
+0CWBS+8V
+=fg3B
+-----END PGP PUBLIC KEY BLOCK-----




svn commit: r38026 - /dev/flink/flink-1.9.2-rc1/

2020-02-12 Thread jincheng
Author: jincheng
Date: Thu Feb 13 02:26:33 2020
New Revision: 38026

Log:
Remove old release candidates for Apache PyFlink 1.9.2

Removed:
dev/flink/flink-1.9.2-rc1/



svn commit: r38025 - in /release/flink/flink-1.9.2: apache-flink-1.9.2.tar.gz apache-flink-1.9.2.tar.gz.asc apache-flink-1.9.2.tar.gz.sha512

2020-02-12 Thread jincheng
Author: jincheng
Date: Thu Feb 13 02:21:38 2020
New Revision: 38025

Log:
Release PyFlink 1.9.2

Added:
release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz   (with props)
release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.asc
release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.sha512

Added: release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz
==
Binary file - no diff available.

Propchange: release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.asc
==
--- release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.asc (added)
+++ release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.asc Thu Feb 13 02:21:38 
2020
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl49JVMACgkQMhGwcDt5
+6g7mWw//UkNQH7s2cT+1feNdc2JRKHg0zC3SdHiHbuOrdaUngUFjeXbYwUo28/q6
+GztQK4bE+m/D/lFlR0mHKSkGZG4hq1pL0NMYvbDlvkxB8RzNRO4zOQL/NVv7UhJU
+/iJRe0KoHAxceNLqtrmPepXP8IU9IepciJto4IR6ORdAuBdMb10WqFQfn0R4NPyf
+igR0OqMQDvm36aXTKDvDmWK6AXqzhTM0jpyXvUxuh0UF7lrkJez+l6JX6DhXGs7Q
+eUVZT2sBxeZ0cqL/tocs4gYy+JxBrd8eZPv60R9oQYSonVBnoHENlRjl6qbPgY7A
+14VD0uCAwRboEggODNDezdVKoby38IjsrvsVQ8RwnwBHEfl1TJt8pM7NCLcylZUV
+kd0hjjPlAllckmWKpXsAAS7xgy0abl2qWvbJm15HgGya4CWbtKC+7upHxNIn7HHn
+ZhUIukf3/AI6sKR1aQE7O9vnZvb5rblNk0Ljoy7sug1VD3F8IpBoUGDC0srckFUb
+6jPrtj4XqMXRwtr9Ka0t0K+akaQv1kX6BTkxq2bbOoxIciXnUiIVwKLPsZrtPZnY
+yqoxw9hrU6cSXoP3jy7j5/y3luVBxxfpGkEZEi77D74rL8fI0GEKQce04b/+B1iw
+mhtFqxnD2K75s/i6ZfI0AhaKOjrNoPegpzip5tHMScCgQfRefFs=
+=fegp
+-----END PGP SIGNATURE-----

Added: release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.sha512
==
--- release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.sha512 (added)
+++ release/flink/flink-1.9.2/apache-flink-1.9.2.tar.gz.sha512 Thu Feb 13 
02:21:38 2020
@@ -0,0 +1 @@
+e0e7c2b770b61264b72f0a3de744ded4734460fe47d2e5baa68d679d9acaa23f92c28d2f50c567c7026259d52dd0255f536ade7598915def0be24279e0f265f8
  apache-flink-1.9.2.tar.gz
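A hedged sketch (not part of the commit) of checking the published .sha512 file with only the Python standard library, assuming both files sit in the current directory:

import hashlib

def sha512_of(path, chunk_size=1 << 20):
    # Hash in chunks so large release tarballs do not need to fit in memory.
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = open("apache-flink-1.9.2.tar.gz.sha512").read().split()[0]
assert sha512_of("apache-flink-1.9.2.tar.gz") == expected, "checksum mismatch"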




svn commit: r37997 - in /dev/flink/flink-1.9.2-rc1: ./ apache-flink-1.9.2.tar.gz apache-flink-1.9.2.tar.gz.asc apache-flink-1.9.2.tar.gz.sha512

2020-02-11 Thread jincheng
Author: jincheng
Date: Tue Feb 11 13:42:58 2020
New Revision: 37997

Log:
Add PyFlink 1.9.2 RC1

Added:
dev/flink/flink-1.9.2-rc1/
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz   (with props)
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc
==
--- dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc (added)
+++ dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc Tue Feb 11 13:42:58 
2020
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl49JVMACgkQMhGwcDt5
+6g7mWw//UkNQH7s2cT+1feNdc2JRKHg0zC3SdHiHbuOrdaUngUFjeXbYwUo28/q6
+GztQK4bE+m/D/lFlR0mHKSkGZG4hq1pL0NMYvbDlvkxB8RzNRO4zOQL/NVv7UhJU
+/iJRe0KoHAxceNLqtrmPepXP8IU9IepciJto4IR6ORdAuBdMb10WqFQfn0R4NPyf
+igR0OqMQDvm36aXTKDvDmWK6AXqzhTM0jpyXvUxuh0UF7lrkJez+l6JX6DhXGs7Q
+eUVZT2sBxeZ0cqL/tocs4gYy+JxBrd8eZPv60R9oQYSonVBnoHENlRjl6qbPgY7A
+14VD0uCAwRboEggODNDezdVKoby38IjsrvsVQ8RwnwBHEfl1TJt8pM7NCLcylZUV
+kd0hjjPlAllckmWKpXsAAS7xgy0abl2qWvbJm15HgGya4CWbtKC+7upHxNIn7HHn
+ZhUIukf3/AI6sKR1aQE7O9vnZvb5rblNk0Ljoy7sug1VD3F8IpBoUGDC0srckFUb
+6jPrtj4XqMXRwtr9Ka0t0K+akaQv1kX6BTkxq2bbOoxIciXnUiIVwKLPsZrtPZnY
+yqoxw9hrU6cSXoP3jy7j5/y3luVBxxfpGkEZEi77D74rL8fI0GEKQce04b/+B1iw
+mhtFqxnD2K75s/i6ZfI0AhaKOjrNoPegpzip5tHMScCgQfRefFs=
+=fegp
+-----END PGP SIGNATURE-----

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512
==
--- dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512 (added)
+++ dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512 Tue Feb 11 
13:42:58 2020
@@ -0,0 +1 @@
+e0e7c2b770b61264b72f0a3de744ded4734460fe47d2e5baa68d679d9acaa23f92c28d2f50c567c7026259d52dd0255f536ade7598915def0be24279e0f265f8
  apache-flink-1.9.2.tar.gz




svn commit: r37929 - in /dev/flink/flink-1.9.2-rc1: ./ apache-flink-1.9.2.tar.gz apache-flink-1.9.2.tar.gz.asc apache-flink-1.9.2.tar.gz.sha512

2020-02-07 Thread jincheng
Author: jincheng
Date: Fri Feb  7 09:02:14 2020
New Revision: 37929

Log:
Add PyFlink 1.9.2 RC1

Added:
dev/flink/flink-1.9.2-rc1/
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz   (with props)
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc
==
--- dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc (added)
+++ dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc Fri Feb  7 09:02:14 
2020
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl49JVMACgkQMhGwcDt5
+6g7mWw//UkNQH7s2cT+1feNdc2JRKHg0zC3SdHiHbuOrdaUngUFjeXbYwUo28/q6
+GztQK4bE+m/D/lFlR0mHKSkGZG4hq1pL0NMYvbDlvkxB8RzNRO4zOQL/NVv7UhJU
+/iJRe0KoHAxceNLqtrmPepXP8IU9IepciJto4IR6ORdAuBdMb10WqFQfn0R4NPyf
+igR0OqMQDvm36aXTKDvDmWK6AXqzhTM0jpyXvUxuh0UF7lrkJez+l6JX6DhXGs7Q
+eUVZT2sBxeZ0cqL/tocs4gYy+JxBrd8eZPv60R9oQYSonVBnoHENlRjl6qbPgY7A
+14VD0uCAwRboEggODNDezdVKoby38IjsrvsVQ8RwnwBHEfl1TJt8pM7NCLcylZUV
+kd0hjjPlAllckmWKpXsAAS7xgy0abl2qWvbJm15HgGya4CWbtKC+7upHxNIn7HHn
+ZhUIukf3/AI6sKR1aQE7O9vnZvb5rblNk0Ljoy7sug1VD3F8IpBoUGDC0srckFUb
+6jPrtj4XqMXRwtr9Ka0t0K+akaQv1kX6BTkxq2bbOoxIciXnUiIVwKLPsZrtPZnY
+yqoxw9hrU6cSXoP3jy7j5/y3luVBxxfpGkEZEi77D74rL8fI0GEKQce04b/+B1iw
+mhtFqxnD2K75s/i6ZfI0AhaKOjrNoPegpzip5tHMScCgQfRefFs=
+=fegp
+-----END PGP SIGNATURE-----

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512
==
--- dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512 (added)
+++ dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512 Fri Feb  7 
09:02:14 2020
@@ -0,0 +1 @@
+e0e7c2b770b61264b72f0a3de744ded4734460fe47d2e5baa68d679d9acaa23f92c28d2f50c567c7026259d52dd0255f536ade7598915def0be24279e0f265f8
  apache-flink-1.9.2.tar.gz




svn commit: r37928 - in /dev/flink: flink-1.9.2-rc0/ flink-1.9.2-rc1/

2020-02-07 Thread jincheng
Author: jincheng
Date: Fri Feb  7 08:58:35 2020
New Revision: 37928

Log:
Remove old release candidates for Apache Flink 1.9.2

Removed:
dev/flink/flink-1.9.2-rc0/
dev/flink/flink-1.9.2-rc1/



svn commit: r37919 - in /dev/flink/flink-1.9.2-rc1: ./ apache-flink-1.9.2.tar.gz apache-flink-1.9.2.tar.gz.asc apache-flink-1.9.2.tar.gz.sha512

2020-02-06 Thread jincheng
Author: jincheng
Date: Fri Feb  7 04:00:03 2020
New Revision: 37919

Log:
Add 1.9.2 rc1

Added:
dev/flink/flink-1.9.2-rc1/
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz   (with props)
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc
dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc
==
--- dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc (added)
+++ dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.asc Fri Feb  7 04:00:03 
2020
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl483/IACgkQMhGwcDt5
+6g7wpBAAuvw/7JuX7ME8MBsp62KSBIPfLFDfsodtWSIav25EH0YlVpM4C2wcuxty
+E5LpL3793hvOWu6o2h189X7tKNPdd6lRL1WYE2twRglT8p+70Rtm2yx79h0o4FKk
+3YhGNYo6J5Dm+WplxtmFYUYKiOA3Q0vFG4GRLT45HhhgT+cMmDBvSeJLbMdo9lM4
+S5AndSEKfoEp6DftMPz1/53xak7JTzoHtHWzr4WWx/gONGOgr+mGWd/fsxHkMUo0
+GQ98+htfS8ERj8v+wljX07TOkAOosr3sX1CmE6XsXfadejAZ7202/8vROudKtoV/
+uEfw6zwX3rKSddLazZ3U3Yn2QJn8ATW04ZYoV4np2wuMjkvrPCuDj8vMp8PAz8pn
+wPQpTW84WwvqjTP8QdHcxLyt7vNUWMR0WoW9xOfMgfsy18danO0vHCcCUgT5qdSy
+U+7nowqYEmOQQWbOEFEIh1ZBc76geN14iAZGw4Id0To+FI0hWGqx1RiAt5fkZzYq
+DjSxNONQb033l0mLE9r1JqL/UjXj/0MYrCBK4ffjt4IOc+k4jMmO9/lpKZ6nPanZ
+flEKqfI6YabMoEyFGQkNhNyUv+2ldpPAOF3hiMH3X9WZ8cJuwcwZ1M8zgjmFJM4w
+YXo8+U1k5orWNxeccUAs3JPLOLBjZKVCYu7z33rCWJC3J3uemUI=
+=a8sR
+-----END PGP SIGNATURE-----

Added: dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512
==
--- dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512 (added)
+++ dev/flink/flink-1.9.2-rc1/apache-flink-1.9.2.tar.gz.sha512 Fri Feb  7 
04:00:03 2020
@@ -0,0 +1 @@
+d88fdc8b849e00e0c64065a93328867c2c58114d5f063472fdf7fd6c15d2648a3f75dee4745ab5345d32d4a4fa8b5f2195c1674f55a103c91f744281a36258bc
  apache-flink-1.9.2.tar.gz




[flink] branch release-1.10 updated: [FLINK-15937][python] Update the Development Status to 5 - Production/Stable (#11028)

2020-02-06 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 62cfc80  [FLINK-15937][python] Update the Development Status to 5 - 
Production/Stable (#11028)
62cfc80 is described below

commit 62cfc8083b578d552d7063cbdbc60c68d6972669
Author: Jincheng Sun 
AuthorDate: Thu Feb 6 16:12:19 2020 +0800

[FLINK-15937][python] Update the Development Status to 5 - 
Production/Stable (#11028)
---
 flink-python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-python/setup.py b/flink-python/setup.py
index 12d3af0..eb42f5f 100644
--- a/flink-python/setup.py
+++ b/flink-python/setup.py
@@ -230,7 +230,7 @@ run sdist.
 long_description=long_description,
 long_description_content_type='text/markdown',
 classifiers=[
-'Development Status :: 1 - Planning',
+'Development Status :: 5 - Production/Stable',
 'License :: OSI Approved :: Apache Software License',
 'Programming Language :: Python :: 3.5',
 'Programming Language :: Python :: 3.6',



[flink] branch master updated (3808858 -> 0acd349)

2020-02-06 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 3808858  [FLINK-15929][python] Update the version limit of grpcio to 
1.26.0
 add 0acd349  [FLINK-15937][python] Update the Development Status to 5 - 
Production/Stable (#11028)

No new revisions were added by this update.

Summary of changes:
 flink-python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[flink] branch release-1.10 updated: [FLINK-15921] [python] Add version range of grpcio. (#11024)

2020-02-05 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new b106af7  [FLINK-15921] [python] Add version range of grpcio. (#11024)
b106af7 is described below

commit b106af7aeb5cf5c8b3036fed276800d425a0e8c5
Author: Jincheng Sun 
AuthorDate: Wed Feb 5 22:16:36 2020 +0800

[FLINK-15921] [python] Add version range of grpcio. (#11024)
---
 flink-python/tox.ini | 1 +
 1 file changed, 1 insertion(+)

diff --git a/flink-python/tox.ini b/flink-python/tox.ini
index 058f4f1..bfbcb84 100644
--- a/flink-python/tox.ini
+++ b/flink-python/tox.ini
@@ -28,6 +28,7 @@ whitelist_externals=
 /bin/bash
 deps =
 pytest
+grpcio>=1.3.5,<=1.14.2
 grpcio-tools>=1.3.5,<=1.14.2
 commands =
 python --version



[flink] branch master updated (cf126ff -> c463358)

2020-02-05 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from cf126ff  [FLINK-15868] Pin snakeyaml dependency in 
flink-connector-elasticsearch5 to 1.25
 add c463358  [FLINK-15921] [python] Add version range of grpcio. (#11024)

No new revisions were added by this update.

Summary of changes:
 flink-python/tox.ini | 1 +
 1 file changed, 1 insertion(+)



svn commit: r37874 - in /dev/flink/flink-1.9.2-rc0: ./ apache-flink-1.9.2.tar.gz apache-flink-1.9.2.tar.gz.asc apache-flink-1.9.2.tar.gz.sha512

2020-02-04 Thread jincheng
Author: jincheng
Date: Wed Feb  5 05:51:18 2020
New Revision: 37874

Log:
Add PyFlink 1.9.2

Added:
dev/flink/flink-1.9.2-rc0/
dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz   (with props)
dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc
dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512

Added: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc
==
--- dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc (added)
+++ dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.asc Wed Feb  5 05:51:18 
2020
@@ -0,0 +1,16 @@
+-----BEGIN PGP SIGNATURE-----
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl45LogACgkQMhGwcDt5
+6g6SAxAAwyxmwGc5b1NTJuTGDWdsfU0RGca14MMievG+a06hzWuZlaZc75bYiWOP
+NzQNmnMrfg2ZhXMAffQK0ipEM/YrZSBdom4ES9ZZc6kIRwwzALTJ34kkmBNyZ6W1
+S2t1Gu0vgzBtZFMm1EqyIjJxIehzGwsS20Gn+sSFoN/OgcXdxBxfrTE7IMeWRxPy
+sbdFB4TbAwyGTPTG5RlGO1OcNQ66ANBuzL+20ajDylbjYIIUB+CrLs0+cWh0NV1H
+0hYSmv/wZITga+8iYzj3OfRoqR0xF/fn9TGvn0Z4GQXETrKnPusjpbjH+0RRcrgd
+qfq6GEyNx2wPBIzC+IwAil+juZlolqp4gt5qwsoknY++J6XCIkvd7CxWzD9t9rQj
+p9rdYpnaTkPpy0GZjOeOZ/azBRaRFLL7pC5+oQF8Hq4EH1ECkFQT3W7GuTCRiCco
+zBjg3FyVJGk1ugv1yt+R7viUv8ezcAmWOXQuRJZft3Tk6yAzt+Yd0EPGt6hXF1dA
+BsM0+aBJJafNpQPIapj+i1GdrPX2mb8Q2mqPKckSWM6qaYO3tnW2nSIELBb3lcqj
+GXqXncXNZrEJ+VoqZqrbPvZwsO4V8GJDN4BpCkUxHEouuR8xldS7B3c35YzePvpn
+JQGx4t8B2hGJ38HasXW0xPjqrwytca/Fa9D8g/7nJ6yv4YTzw6E=
+=zu1Q
+-----END PGP SIGNATURE-----

Added: dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512
==
--- dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512 (added)
+++ dev/flink/flink-1.9.2-rc0/apache-flink-1.9.2.tar.gz.sha512 Wed Feb  5 
05:51:18 2020
@@ -0,0 +1 @@
+d4d5bb3d4b4ce4bfb910eb5f8622322e23f416ad13140f29e29a550d7ab780c01bf0c6d8d26c31476b0c5566d40437dfe2015d5b9a94def217e518ab5e73f366
  apache-flink-1.9.2.tar.gz




svn commit: r37793 - /release/flink/flink-1.9.1/

2020-01-29 Thread jincheng
Author: jincheng
Date: Thu Jan 30 06:36:39 2020
New Revision: 37793

Log:
Remove flink-1.9.1

Removed:
release/flink/flink-1.9.1/



svn commit: r37792 - /dev/flink/flink-1.9.2-rc1/ /release/flink/flink-1.9.2/

2020-01-29 Thread jincheng
Author: jincheng
Date: Thu Jan 30 06:26:44 2020
New Revision: 37792

Log:
Release Flink 1.9.2

Added:
release/flink/flink-1.9.2/
  - copied from r37791, dev/flink/flink-1.9.2-rc1/
Removed:
dev/flink/flink-1.9.2-rc1/



[flink] branch release-1.10 updated: [hotfix] [python] Change the text of PyPI Author from `Flink Developers` to `Apache Software Foundation` (#10799)

2020-01-08 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.10
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.10 by this push:
 new 6e26f75  [hotfix] [python] Change the text of PyPI Author from `Flink 
Developers` to `Apache Software Foundation` (#10799)
6e26f75 is described below

commit 6e26f753a8857c70cf308b705843c73c38b303c8
Author: Jincheng Sun 
AuthorDate: Wed Jan 8 17:07:44 2020 +0800

[hotfix] [python] Change the text of PyPI Author from `Flink Developers` to 
`Apache Software Foundation` (#10799)
---
 flink-python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-python/setup.py b/flink-python/setup.py
index a7c3620..12d3af0 100644
--- a/flink-python/setup.py
+++ b/flink-python/setup.py
@@ -220,7 +220,7 @@ run sdist.
 scripts=scripts,
 url='https://flink.apache.org',
 license='https://www.apache.org/licenses/LICENSE-2.0',
-author='Flink Developers',
+author='Apache Software Foundation',
 author_email='d...@flink.apache.org',
 python_requires='>=3.5',
 install_requires=['py4j==0.10.8.1', 'python-dateutil==2.8.0', 
'apache-beam==2.15.0',



[flink] branch master updated (6132f81 -> 039955c)

2020-01-08 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 6132f81  [hofix] Improve failure message of shuffle memory sanity check
 add 039955c  [hotfix] [python] Change the text of PyPI Author from `Flink 
Developers` to `Apache Software Foundation` (#10799)

No new revisions were added by this update.

Summary of changes:
 flink-python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



svn commit: r37171 - /release/flink/flink-1.8.2/

2019-12-10 Thread jincheng
Author: jincheng
Date: Wed Dec 11 03:52:44 2019
New Revision: 37171

Log:
Remove old release

Removed:
release/flink/flink-1.8.2/



svn commit: r37169 - /dev/flink/flink-1.8.3-rc3/ /release/flink/flink-1.8.3/

2019-12-10 Thread jincheng
Author: jincheng
Date: Wed Dec 11 02:55:14 2019
New Revision: 37169

Log:
Release Flink 1.8.3

Added:
release/flink/flink-1.8.3/
  - copied from r37168, dev/flink/flink-1.8.3-rc3/
Removed:
dev/flink/flink-1.8.3-rc3/



[flink] branch master updated: [FLINK-14198][python] Add "type" and "rtype" options to flink python API docstrings of table.py and table_environment.py

2019-12-06 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 68bba73  [FLINK-14198][python] Add "type" and "rtype" options to flink 
python API docstrings of table.py and table_environment.py
68bba73 is described below

commit 68bba73ab705b7efe9c984235dbf0c20f64c375c
Author: Wei Zhong 
AuthorDate: Tue Dec 3 10:17:42 2019 +0800

[FLINK-14198][python] Add "type" and "rtype" options to flink python API 
docstrings of table.py and table_environment.py

This closes #10389
---
 flink-python/pyflink/table/table.py | 235 +++-
 flink-python/pyflink/table/table_environment.py | 203 
 2 files changed, 278 insertions(+), 160 deletions(-)

diff --git a/flink-python/pyflink/table/table.py b/flink-python/pyflink/table/table.py
index 2feaf52..5285784 100644
--- a/flink-python/pyflink/table/table.py
+++ b/flink-python/pyflink/table/table.py
@@ -20,7 +20,6 @@ from py4j.java_gateway import get_method
 from pyflink.java_gateway import get_gateway
 from pyflink.table.table_schema import TableSchema
 
-from pyflink.table.window import GroupWindow
 from pyflink.util.utils import to_jarray
 
 __all__ = ['Table', 'GroupedTable', 'GroupWindowedTable', 'OverWindowedTable', 'WindowGroupedTable']
@@ -29,11 +28,11 @@ __all__ = ['Table', 'GroupedTable', 'GroupWindowedTable', 'OverWindowedTable', '
 class Table(object):
 
     """
-    A :class:`Table` is the core component of the Table API.
+    A :class:`~pyflink.table.Table` is the core component of the Table API.
     Similar to how the batch and streaming APIs have DataSet and DataStream,
-    the Table API is built around :class:`Table`.
+    the Table API is built around :class:`~pyflink.table.Table`.
 
-    Use the methods of :class:`Table` to transform data.
+    Use the methods of :class:`~pyflink.table.Table` to transform data.
 
     Example:
     ::
@@ -70,7 +69,9 @@ class Table(object):
         >>> tab.select("key, value + 'hello'")
 
         :param fields: Expression string.
-        :return: The result :class:`Table`.
+        :type fields: str
+        :return: The result table.
+        :rtype: pyflink.table.Table
         """
         return Table(self._j_table.select(fields))
 
@@ -85,7 +86,9 @@ class Table(object):
         >>> tab.alias("a, b")
 
         :param fields: Field list expression string.
-        :return: The result :class:`Table`.
+        :type fields: str
+        :return: The result table.
+        :rtype: pyflink.table.Table
         """
         return Table(get_method(self._j_table, "as")(fields))
 
@@ -100,7 +103,9 @@ class Table(object):
         >>> tab.filter("name = 'Fred'")
 
         :param predicate: Predicate expression string.
-        :return: The result :class:`Table`.
+        :type predicate: str
+        :return: The result table.
+        :rtype: pyflink.table.Table
         """
         return Table(self._j_table.filter(predicate))
 
@@ -115,7 +120,9 @@ class Table(object):
         >>> tab.where("name = 'Fred'")
 
         :param predicate: Predicate expression string.
-        :return: The result :class:`Table`.
+        :type predicate: str
+        :return: The result table.
+        :rtype: pyflink.table.Table
         """
         return Table(self._j_table.where(predicate))
 
@@ -130,7 +137,9 @@ class Table(object):
         >>> tab.group_by("key").select("key, value.avg")
 
         :param fields: Group keys.
-        :return: The grouped :class:`Table`.
+        :type fields: str
+        :return: The grouped table.
+        :rtype: pyflink.table.GroupedTable
         """
         return GroupedTable(self._j_table.groupBy(fields))
 
@@ -143,20 +152,21 @@ class Table(object):
 
         >>> tab.select("key, value").distinct()
 
-        :return: The result :class:`Table`.
+        :return: The result table.
+        :rtype: pyflink.table.Table
         """
         return Table(self._j_table.distinct())
 
     def join(self, right, join_predicate=None):
         """
-        Joins two :class:`Table`. Similar to a SQL join. The fields of the two joined
+        Joins two :class:`~pyflink.table.Table`. Similar to a SQL join. The fields of the two joined
         operations must not overlap, use :func:`~pyflink.table.Table.alias` to rename fields if
         necessary. You can use where and select clauses after a join to further specify the
         behaviour of the join.
 
         .. note::
 
-            Both tables must be bound to the same :cla
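The convention this commit applies across table.py and table_environment.py, condensed into one hypothetical stand-in class (a sketch, not the real implementation):

class Table(object):
    """Hypothetical stand-in showing the ':type'/':rtype' Sphinx fields."""

    def __init__(self, j_table):
        self._j_table = j_table

    def where(self, predicate):
        """
        Filters out rows that do not satisfy the predicate.

        :param predicate: Predicate expression string.
        :type predicate: str
        :return: The result table.
        :rtype: pyflink.table.Table
        """
        return Table(self._j_table.where(predicate))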

svn commit: r36954 - /release/flink/KEYS

2019-11-26 Thread jincheng
Author: jincheng
Date: Wed Nov 27 03:19:26 2019
New Revision: 36954

Log:
Add he...@apache.org to KEYS file.

Modified:
release/flink/KEYS

Modified: release/flink/KEYS
==
--- release/flink/KEYS (original)
+++ release/flink/KEYS Wed Nov 27 03:19:26 2019
@@ -1645,3 +1645,63 @@ iDiNryfXiNtbbRW/XZ3sWEfuzWSZkjgfYPYHyE14
 OKMsp9EJ7so=
 =/Nn9
 -----END PGP PUBLIC KEY BLOCK-----
+
+pub   rsa4096 2019-11-27 [SC]
+      EF88474C564C7A608A822EEC3FF96A2057B6476C
+uid           [ultimate] Hequn Cheng (CODE SIGNING KEY)
+sig 3        3FF96A2057B6476C 2019-11-27  Hequn Cheng (CODE SIGNING KEY)
+sub   rsa4096 2019-11-27 [E]
+sig          3FF96A2057B6476C 2019-11-27  Hequn Cheng (CODE SIGNING KEY)
+
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBF3d3kUBEADQFrKhRIlShRd3sB64S8TUPfYptGywv3CwLdBNOQYHSpx0qB8a
+SMs8HzJYpmK2/uNTrm3gTmTZmb48pVDixe03qOqLhNAWEkr88lnXopxB3vfwzrlO
+Ql+2RDKPstLq93PmIOPweOo0dCzchhg+qhvVzeE4hVnbbZn6KcLqnHdnpyTQFy0m
+buFAZ/b7vdgqVbjamPHYRsiAUOptt9NwjRLhotob2ipXITUCX3vUMEG5nOaSZajo
+JucWc1wrn9Lmgsq/yxthPSp24A9Ea20pvttAAD+ZTW+EtTm+jaMNK/pn6pO54u6u
+Mz5fd3hv52OCoXLOhoYQDlwL1L+YQAtWip2Mo7sZqcVwGoAq1NGePcNCFdEDg7qh
+LzfSMlxuStYwMwe7n076/nI4tX6gtoi1guvaB8r9M/SDTsCOSeTzsu5hhPB9QISg
+LmQT/UAWP1UjpbF4LuiOsargwdjZq34DWI8S4d4h7ONzxKhQ3bi4JtfF6JXw/Exd
+sotytM1/Nfh6rY9a59JoWtb1gicSgAzc0Le+dV2i/uFD89/5AMMmtG9DXELEXPR8
+6SM5BiNoKDLvRsgdWT4FY4ySnCUo12YbfgZCTlqg2obPAUrzd/WyIWAO4Ve2P8Aa
+2VmCjWgTI929e0n8PoDId0MdhlGTvyjIpFT/Xf6fcrWMMyPMK9AY6OTR7QARAQAB
+tDFIZXF1biBDaGVuZyAoQ09ERSBTSUdOSU5HIEtFWSkgPGhlcXVuQGFwYWNoZS5v
+cmc+iQJOBBMBCAA4FiEE74hHTFZMemCKgi7sP/lqIFe2R2wFAl3d3kUCGwMFCwkI
+BwIGFQoJCAsCBBYCAwECHgECF4AACgkQP/lqIFe2R2wstw//fDEdNPtuDOs+Kpel
+KPHeugMOcuF6801KEn6Zon4ndUfEk1pXdjBdtTU54qgUilRG8CIXkkPsvfqoc+pV
+2mX8bMzajdYfjdIu/YpgL1JjRVvx1qZmQQkO0FECKyFrz/QxIuW/nPQhburkFn6F
+bc7JJ4G8yyIXXlfT7I4juPbgIDBcx/vXpZJE1jvmrRX8F18NjkytEWC66EHlzniF
+eujecIbPLj84wSTtSJISQtuxxhfuS2mEsn9ac3fdBbZ2u2ZFLY82tsEbP1TQKI2T
+MzbN+uJuT8h8inIsHayAaa6gXhnMa5ZjdUmDIKeOCwbDrCOnVpQy1j/XSPHofiuE
+uDZ+jvIrb30xSnFOVWhWqAivyj6RnSBFFNrj/s2TDZ/Jh2nuhTQ1wT3xn+uNXCm2
+H8B2a0Os6q8M//nIPCwAPccq/wfkWfG3ujLg0LByS3VGIEQSYqU3K5Gv/mwMCUK5
+DWjFUoZZTCn75/jPezExXEWJbhq8yS/X0qDgSjX0HNaUNmnfYG3Guci0eixtKj33
+6aXb5baeuZ5E/K49irXJZt+bWRSi7OIW0+gCVI8IIaXTCgynscYZHtb7GxjrUUs/
+KI1w6iNfp/1WXsyr/mjAlr9nyvr4HP1Kn9lqq8xsdHtic9mbyMK0WdhG42Z2detX
+Wyq5jaYNwdBUeRcVlCgf2azk1Fu5Ag0EXd3eRQEQALf87IhmxN6S+6T+gEBBvNp9
+9FzuD+Gh1wYikimWcCqEn+CBRNv6CRC2tbelRP+kf/whnZkIKIZo5wAX7pf3sYUq
+b/1ewwmgUw2NULhHTXM4BN0nIlMFHS2MN+TsWSt6sreoMkQNrOxbawdw4ccaOKUD
+6vg4DuZ5p5xV0v1WlIsG81auFuHzmw+xxCbZI+Sxkx+YgY0qThrEADHB7l/dd2N9
+5SMYnp1JwGXqUQTfoVw6DCw5w/appqxDdsVzag3zpGdw866FMTBd43T8ucsGiTK9
+8WOnHQquX/VOp+DYxlIRvcgiFyRdux5XbU79xue8k4rOu91ONMqCXmO92IahmP8T
+dYcnH+eqaUWpaJOHT2JJbMBOSclUOjVwzgC31mQ51DLbvuc7Adhvsvju/TGR5Zog
+k/Q3RVibhtqBYShzlZC0fWbrQWmY9kXxRySmEY8MzCwlgS3EEnT6jdxfm36DjLZf
+w18aannaVejPZZdhRindyKcAY8N13ubWkWs75oxRTD7EulqcyYxMmWzxNBAI4Ymf
+eJZlFuvxoLvOPRJghVGihGDiU5QS9dC8kcR4Rs/0Om4gWftAogoE1+EzSJQPMkEG
+mVq2WdZ14dpIuJIAH23XcnffDiM0nExOpefJ4GKhFfeXK3Ly5LCRiFq8LOcFYpbC
+6XbEIgakKFe3+LQaN/HdABEBAAGJAjYEGAEIACAWIQTviEdMVkx6YIqCLuw/+Wog
+V7ZHbAUCXd3eRQIbDAAKCRA/+WogV7ZHbMJ2EACwowleeacT41gPGl6nMxSY7kx0
+EhiQGWSOJtSH+uyujSufx97PcwIc1OpZI/A+MRAtUXYHNaaQoY/j9eVo4O3v/+ox
+kobT6ht9oZA71euLnhF5bWGMinr253gi7eroNHzWl/2t7v9e0MEJf047VmN2qsOY
+9xGr7xNfRLRQ18mo3x40ZuJVGNRGvL93pXbtr4l5JTixUUADZvQw7NzQEapcA1y0
+tCroDq/o3yU/qPxb0Y+WLHzcPUThnwE19xNCRAsi3liGRcx1pTxOQX9+kfEsYzpP
+/YB9FwigkqtkaorcW9FgppNbTX6bE0ZJv6zCLcYg+RGMQK955uudL+NbUNqLvTnP
+v6z/mQ0BEMbLj9TqZd5FuslrkufE0ci1d/kSzKsWgLOF7TZY6Hx+OoDYgPtAt1nB
+zsaENIfhQepaRzWYm646WR6g1J4/9NeuUGduYBqFyMxcCDB9ekq3izS+eAXKOI9X
+Xeyq+Ovs77gKbLbl98V5ykiAIx1+5M8JWbTo46Z7JCFV21sFDmrEbxOA/2hH11XY
+VpuYpCqTrp5aTg9CFs/X/RHxsQlzrlXcOZDW68IOT6Jpx72XQPf9DBbdXlaoHBa9
+aC5mzB+jlndEpImm7oChaab88yUadN63BNWRgZ+n2t1y0sb9Hd2gwj+e9bJ3stjj
+lI+DKKtNmX190sJARg==
+=47gW
+-----END PGP PUBLIC KEY BLOCK-----




svn commit: r36390 - /release/flink/flink-1.9.0/

2019-10-19 Thread jincheng
Author: jincheng
Date: Sat Oct 19 12:09:27 2019
New Revision: 36390

Log:
Remove files for 1.9.0 release

Removed:
release/flink/flink-1.9.0/



svn commit: r36388 - /dev/flink/flink-1.9.1-rc1/ /release/flink/flink-1.9.1/

2019-10-19 Thread jincheng
Author: jincheng
Date: Sat Oct 19 11:53:53 2019
New Revision: 36388

Log:
Release Flink 1.9.1

Added:
release/flink/flink-1.9.1/
  - copied from r36387, dev/flink/flink-1.9.1-rc1/
Removed:
dev/flink/flink-1.9.1-rc1/



[flink-web] branch asf-site updated: Update release 1.8.2 blog

2019-09-12 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new c048fa4  Update release 1.8.2 blog
c048fa4 is described below

commit c048fa409285b65c6a1ab9bc8a4ec428f9714cad
Author: sunjincheng121 
AuthorDate: Thu Sep 12 16:17:29 2019 +0800

Update release 1.8.2 blog
---
 _config.yml|  2 +-
 ...elease-1.8.2.md => 2019-09-11-release-1.8.2.md} |  4 ++--
 content/blog/index.html| 28 +++---
 content/blog/page2/index.html  |  4 ++--
 content/blog/page3/index.html  |  4 ++--
 content/blog/page4/index.html  |  4 ++--
 content/blog/page5/index.html  |  4 ++--
 content/blog/page6/index.html  |  4 ++--
 content/blog/page7/index.html  |  4 ++--
 content/blog/page8/index.html  |  4 ++--
 content/blog/page9/index.html  |  4 ++--
 content/index.html |  8 +++
 content/news/2019/09/{03 => 11}/release-1.8.2.html |  2 +-
 content/zh/index.html  |  8 +++
 14 files changed, 42 insertions(+), 42 deletions(-)

diff --git a/_config.yml b/_config.yml
index abf2db3..d5b1eb7 100644
--- a/_config.yml
+++ b/_config.yml
@@ -308,7 +308,7 @@ release_archive:
   -
 version_short: 1.8
 version_long: 1.8.2
-release_date: 2019-09-10
+release_date: 2019-09-11
   -
 version_short: 1.8
 version_long: 1.8.1
diff --git a/_posts/2019-09-03-release-1.8.2.md 
b/_posts/2019-09-11-release-1.8.2.md
similarity index 99%
rename from _posts/2019-09-03-release-1.8.2.md
rename to _posts/2019-09-11-release-1.8.2.md
index c777a21..ca54955 100644
--- a/_posts/2019-09-03-release-1.8.2.md
+++ b/_posts/2019-09-11-release-1.8.2.md
@@ -1,7 +1,7 @@
 ---
 layout: post
 title:  "Apache Flink 1.8.2 Released"
-date:   2019-09-03 12:00:00
+date:   2019-09-11 12:00:00
 categories: news
 authors:
 - jark:
@@ -93,4 +93,4 @@ List of resolved issues:
 
 [FLINK-12749] - 
Add Flink Operations Playground documentation
 
-
\ No newline at end of file
+
diff --git a/content/blog/index.html b/content/blog/index.html
index 6e12c01..da7d5ca 100644
--- a/content/blog/index.html
+++ b/content/blog/index.html
@@ -182,29 +182,29 @@
 
 
 
-  Flink Community Update - 
September'19
+  Apache Flink 1.8.2 Released
 
-  10 Sep 2019
-   Marta Paes (https://twitter.com/morsapaes;>@morsapaes)
+  11 Sep 2019
+   Jark Wu (https://twitter.com/JarkWu;>@JarkWu)
 
-  This has been an exciting, fast-paced year for the Apache Flink 
community. But with over 10k messages across the mailing lists, 3k Jira tickets 
and 2k pull requests, it is not easy to keep up with the latest state of the 
project. Plus everything happening around it. With that in mind, we want to 
bring back regular community updates to the Flink blog.
+  The Apache Flink community released the second bugfix version of 
the Apache Flink 1.8 series.
 
-  Continue reading 

+
+
+  Continue reading 

 
 
 
 
 
-  Apache Flink 1.8.2 Released
-
-  03 Sep 2019
-   Jark Wu (https://twitter.com/JarkWu;>@JarkWu)
+  Flink Community Update - 
September'19
 
-  The Apache Flink community released the second bugfix version of 
the Apache Flink 1.8 series.
+  10 Sep 2019
+   Marta Paes (https://twitter.com/morsapaes;>@morsapaes)
 
-
+  This has been an exciting, fast-paced year for the Apache Flink 
community. But with over 10k messages across the mailing lists, 3k Jira tickets 
and 2k pull requests, it is not easy to keep up with the latest state of the 
project. Plus everything happening around it. With that in mind, we want to 
bring back regular community updates to the Flink blog.
 
-  Continue reading 

+  Continue reading 

 
 
 
@@ -350,7 +350,7 @@
 
 
   
-  Flink Community 
Update - September'19
+  Apache Flink 1.8.2 
Released
 
   
 
@@ -360,7 +360,7 @@
   
 
   
-  Apache Flink 1.8.2 
Released
+  Flink Community 
Update - September'19
 
   
 
diff --git a/content/blog/page2/index.html b/content/blog/page2/index.html
index 80ac267..adc4636 100644
--- a/content/blog/page2/index.html
+++ b/content/blog/page2/index.html
@@ -359,7 +359,7 @@ for more details.
 
 
   
-  Flink Community 
Update - September'19
+  Apache Flink 1.8.2 
Released
 
   
 
@@ -369,7 +369,7 @@ for more details.
   
 
   
-  Apache Flink 1.8.2 
Released
+  Flink Community 
Update - September'19
 
   
 
diff --git a/content/blog/page3/index.html b/content

[flink-web] 01/02: Add Apache Flink release 1.8.2

2019-09-12 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 034cc7243e80f276dabc3c5f94ffc3b414752d4f
Author: Jark Wu 
AuthorDate: Thu Sep 5 12:36:24 2019 +0800

Add Apache Flink release 1.8.2
---
 _config.yml| 56 +++---
 _posts/2019-09-03-release-1.8.2.md | 96 ++
 2 files changed, 126 insertions(+), 26 deletions(-)

diff --git a/_config.yml b/_config.yml
index 8529ec2..abf2db3 100644
--- a/_config.yml
+++ b/_config.yml
@@ -129,23 +129,23 @@ flink_releases:
   -
   version_short: 1.8
   binary_release:
-  name: "Apache Flink 1.8.1"
+  name: "Apache Flink 1.8.2"
   scala_211:
-  id: "181-download_211"
-  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.11.tgz"
-  asc_url: 
"https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.11.tgz.asc"
-  sha512_url: 
"https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.11.tgz.sha512"
+  id: "182-download_211"
+  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-1.8.2/flink-1.8.2-bin-scala_2.11.tgz"
+  asc_url: 
"https://www.apache.org/dist/flink/flink-1.8.2/flink-1.8.2-bin-scala_2.11.tgz.asc"
+  sha512_url: 
"https://www.apache.org/dist/flink/flink-1.8.2/flink-1.8.2-bin-scala_2.11.tgz.sha512"
   scala_212:
-  id: "181-download_212"
-  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.12.tgz"
-  asc_url: 
"https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.12.tgz.asc"
-  sha512_url: 
"https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.12.tgz.sha512"
+  id: "182-download_212"
+  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-1.8.2/flink-1.8.2-bin-scala_2.12.tgz"
+  asc_url: 
"https://www.apache.org/dist/flink/flink-1.8.2/flink-1.8.2-bin-scala_2.12.tgz.asc"
+  sha512_url: 
"https://www.apache.org/dist/flink/flink-1.8.2/flink-1.8.2-bin-scala_2.12.tgz.sha512"
   source_release:
-  name: "Apache Flink 1.8.1"
-  id: "181-download-source"
-  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-1.8.1/flink-1.8.1-src.tgz"
-  asc_url: 
"https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-src.tgz.asc"
-  sha512_url: 
"https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-src.tgz.sha512"
+  name: "Apache Flink 1.8.2"
+  id: "182-download-source"
+  url: 
"https://www.apache.org/dyn/closer.lua/flink/flink-1.8.2/flink-1.8.2-src.tgz"
+  asc_url: 
"https://www.apache.org/dist/flink/flink-1.8.2/flink-1.8.2-src.tgz.asc"
+  sha512_url: 
"https://www.apache.org/dist/flink/flink-1.8.2/flink-1.8.2-src.tgz.sha512"
   optional_components:
 -
   name: "Pre-bundled Hadoop 2.4.1"
@@ -183,26 +183,26 @@ flink_releases:
   name: "Avro SQL Format"
   category: "SQL Formats"
   scala_dependent: false
-  id: 181-sql-format-avro
-  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.1/flink-avro-1.8.1.jar
-  asc_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.1/flink-avro-1.8.1.jar.asc
-  sha_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.1/flink-avro-1.8.1.jar.sha1
+  id: 182-sql-format-avro
+  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.2/flink-avro-1.8.2.jar
+  asc_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.2/flink-avro-1.8.2.jar.asc
+  sha_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.2/flink-avro-1.8.2.jar.sha1
 -
   name: "CSV SQL Format"
   category: "SQL Formats"
   scala_dependent: false
-  id: 181-sql-format-csv
-  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-csv/1.8.1/flink-csv-1.8.1.jar
-  asc_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-csv/1.8.1/flink-csv-1.8.1.jar.asc
-  sha_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-csv/1.8.1/flink-csv-1.8.1.jar.sha1
+  id: 182-sql-format-csv
+  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-csv/1.8.2/flink-csv-1.8.2.jar
+  asc_url: 
https://repo.maven.apache.org/maven2/org/apache/f

[flink-web] 02/02: Rebuild website

2019-09-12 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 4c3a4c8b9dde0ff9aaa01c0f99cbd044bec5d2a4
Author: sunjincheng121 
AuthorDate: Thu Sep 12 15:10:06 2019 +0800

Rebuild website
---
 content/blog/index.html|  40 ++--
 content/blog/page2/index.html  |  40 ++--
 content/blog/page3/index.html  |  40 ++--
 content/blog/page4/index.html  |  38 ++--
 content/blog/page5/index.html  |  36 ++--
 content/blog/page6/index.html  |  36 ++--
 content/blog/page7/index.html  |  42 ++--
 content/blog/page8/index.html  |  48 +++--
 content/blog/page9/index.html  |  29 +++
 content/downloads.html |  29 ++-
 content/index.html |   8 +-
 content/news/2019/09/03/release-1.8.2.html | 319 +
 content/zh/downloads.html  |  33 ++-
 content/zh/index.html  |   8 +-
 14 files changed, 598 insertions(+), 148 deletions(-)

diff --git a/content/blog/index.html b/content/blog/index.html
index 06b3ff3..6e12c01 100644
--- a/content/blog/index.html
+++ b/content/blog/index.html
@@ -195,6 +195,21 @@
 
 
 
+  Apache Flink 1.8.2 Released
+
+  03 Sep 2019
+   Jark Wu (https://twitter.com/JarkWu;>@JarkWu)
+
+  The Apache Flink community released the second bugfix version of 
the Apache Flink 1.8 series.
+
+
+
+  Continue reading 

+
+
+
+
+
   Apache Flink 1.9.0 Release 
Announcement
 
   22 Aug 2019
@@ -303,21 +318,6 @@
 
 
 
-
-  Apache 
Flink's Application to Season of Docs
-
-  17 Apr 2019
-   Konstantin Knauf (https://twitter.com/snntrable;>@snntrable)
-
-  The Apache Flink community is happy to announce its application to 
the first edition of https://developers.google.com/season-of-docs/;>Season of Docs by 
Google. The program is bringing together Open Source projects and technical 
writers to raise awareness for and improve documentation of Open Source 
projects. While the community is continuously looking for new contributors to 
collaborate on our documentation, we would like to take this chance to work 
with one or  [...]
-
-
-
-  Continue reading 
-
-
-
-
 
 
 
@@ -360,6 +360,16 @@
   
 
   
+  Apache Flink 1.8.2 
Released
+
+  
+
+  
+
+  
+  
+
+  
   Apache Flink 1.9.0 
Release Announcement
 
   
diff --git a/content/blog/page2/index.html b/content/blog/page2/index.html
index 1021f57..80ac267 100644
--- a/content/blog/page2/index.html
+++ b/content/blog/page2/index.html
@@ -182,6 +182,21 @@
 
 
 
+  Apache 
Flink's Application to Season of Docs
+
+  17 Apr 2019
+   Konstantin Knauf (https://twitter.com/snntrable;>@snntrable)
+
+  The Apache Flink community is happy to announce its application to 
the first edition of https://developers.google.com/season-of-docs/;>Season of Docs by 
Google. The program is bringing together Open Source projects and technical 
writers to raise awareness for and improve documentation of Open Source 
projects. While the community is continuously looking for new contributors to 
collaborate on our documentation, we would like to take this chance to work 
with one or  [...]
+
+
+
+  Continue reading 
+
+
+
+
+
   Apache Flink 1.8.0 Release 
Announcement
 
   09 Apr 2019
@@ -312,21 +327,6 @@ for more details.
 
 
 
-
-  Apache Flink 1.7.1 Released
-
-  21 Dec 2018
-  
-
-  The Apache Flink community released the first bugfix version of 
the Apache Flink 1.7 series.
-
-
-
-  Continue reading 

-
-
-
-
 
 
 
@@ -369,6 +369,16 @@ for more details.
   
 
   
+  Apache Flink 1.8.2 
Released
+
+  
+
+  
+
+  
+  
+
+  
   Apache Flink 1.9.0 
Release Announcement
 
   
diff --git a/content/blog/page3/index.html b/content/blog/page3/index.html
index fc2f3cb..fdb6f56 100644
--- a/content/blog/page3/index.html
+++ b/content/blog/page3/index.html
@@ -182,6 +182,21 @@
 
 
 
+  Apache Flink 1.7.1 Released
+
+  21 Dec 2018
+  
+
+  The Apache Flink community released the first bugfix version of 
the Apache Flink 1.7 series.
+
+
+
+  Continue reading 

+
+
+
+
+
   Apache Flink 1.7.0 Release 
Announcement
 
   30 Nov 2018
@@ -318,21 +333,6 @@ Please check the https://issues.apache.org/jira/secure/ReleaseNote.jspa
 
 
 
-
-  Apache Flink 1.5.0 Release 
Announcement
-
-  25 May 2018
-   Fabian Hueske (https://twitter.com/fhueske;>@fhueske)
-
-  The Apache Flink community is thrilled to announce the 1.5.0 
release. Over the past 5 months, the Flink communit

[flink-web] branch asf-site updated (510b821 -> 4c3a4c8)

2019-09-12 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git.


from 510b821  Rebuild website
 new 034cc72  Add Apache Flink release 1.8.2
 new 4c3a4c8  Rebuild website

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 _config.yml| 56 +++--
 _posts/2019-09-03-release-1.8.2.md | 96 ++
 content/blog/index.html| 40 +
 content/blog/page2/index.html  | 40 +
 content/blog/page3/index.html  | 40 +
 content/blog/page4/index.html  | 38 ++---
 content/blog/page5/index.html  | 36 +---
 content/blog/page6/index.html  | 36 +---
 content/blog/page7/index.html  | 42 +-
 content/blog/page8/index.html  | 48 ++-
 content/blog/page9/index.html  | 29 +++
 content/downloads.html | 29 +--
 content/index.html |  8 +-
 .../09/03/release-1.8.2.html}  | 80 +-
 content/zh/downloads.html  | 33 +---
 content/zh/index.html  |  8 +-
 16 files changed, 445 insertions(+), 214 deletions(-)
 create mode 100644 _posts/2019-09-03-release-1.8.2.md
 copy content/news/{2018/03/08/release-1.4.2.html => 
2019/09/03/release-1.8.2.html} (71%)



svn commit: r35781 - /release/flink/flink-1.8.1/

2019-09-12 Thread jincheng
Author: jincheng
Date: Thu Sep 12 06:19:05 2019
New Revision: 35781

Log:
remove flink-1.8.1

Removed:
release/flink/flink-1.8.1/



svn commit: r35780 - /dev/flink/flink-1.8.2-rc1/ /release/flink/flink-1.8.2/

2019-09-11 Thread jincheng
Author: jincheng
Date: Thu Sep 12 04:31:27 2019
New Revision: 35780

Log:
Release Flink 1.8.2

Added:
release/flink/flink-1.8.2/
  - copied from r35779, dev/flink/flink-1.8.2-rc1/
Removed:
dev/flink/flink-1.8.2-rc1/



svn commit: r35539 - /release/flink/KEYS

2019-09-04 Thread jincheng
Author: jincheng
Date: Wed Sep  4 09:04:40 2019
New Revision: 35539

Log:
Add j...@apache.org to KEYS file

Modified:
release/flink/KEYS

Modified: release/flink/KEYS
===================================================================
--- release/flink/KEYS (original)
+++ release/flink/KEYS Wed Sep  4 09:04:40 2019
@@ -1586,3 +1586,62 @@ ibvlVAwx1YrQLPWw9Mx1w4zus5tS8ojlHukjKqqJ
 LxugXjuYxd7c7Dut00zsORdAST64aU8P9LE/SVNpGZhkHaCj5rtkug==
 =iACU
 -----END PGP PUBLIC KEY BLOCK-----
+pub   rsa4096 2019-09-04 [SC]
+  E2C45417BED5C104154F341085BACB5AEFAE3202
+uid   [ultimate] Jark Wu (CODE SIGNING KEY) 
+sig 385BACB5AEFAE3202 2019-09-04  Jark Wu (CODE SIGNING KEY) 

+sub   rsa4096 2019-09-04 [E]
+sig  85BACB5AEFAE3202 2019-09-04  Jark Wu (CODE SIGNING KEY) 

+
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBF1vdWMBEACmuGdKHF+UshpTCTQWRssJuRdGD7z56EbKW6RE1sH4bSco31Yu
+ml7hNbJiyel+UhdXEFNbTl+EsgQI3rb+3rGW2GPwdnie/05X+hVQAFYPqnczrKtr
+L614WuNNfP3p8Ysi6vYHY1eMXY3awNZO0/khWRlEZWk4QwJVKR0rD6jpwJd6u+Qk
+wf6qo1S9zedraqPqyGOOVeqLwjBlTTsXnkE9+KJiDSzdU2pOFtz/I/0D05/ov/dH
+9r3boQZrrnnUc5Ie9Bzm5+12oAaQ6cyAVyHsdPv/e4v5n+4iJNWbukkj05a8r+My
+IRyJhcAzWe5ZSKV7kxK4g2SN2og5n9lEQFHeFj7DZfRYMrpzTSxojKZoLrwrIn1U
+NRbe7MzcbCWr7Ih2BJTz/+ekIYRnjpyrD4fkzVKcgGWpJWlgNbRtQvYdX9FyPieu
+3Os8O+vfIT7WC1ugt0jauoTXkIGVSp4/H7P7O1TXA547zXmpqDq8++ZsCFRoBCvV
+GUC67GuL/VqV1LDCjO1Sz97/+pmTGwjqHMpTGeiAQN62GpISdxOH8tHXoLTNTBPR
+KN7asBUBAGozDphkuniUsyjTOUNKClN3uy8iplTHQaChKbEgb1RHbgbKW+1zO3gk
+7vXobQLNB0/VcWZKHDevaiPWsaRZDNiEUQWjoVWn+ClSsAKoKdMz8tFQJQARAQAB
+tCxKYXJrIFd1IChDT0RFIFNJR05JTkcgS0VZKSA8amFya0BhcGFjaGUub3JnPokC
+TgQTAQgAOBYhBOLEVBe+1cEEFU80EIW6y1rvrjICBQJdb3VjAhsDBQsJCAcCBhUK
+CQgLAgQWAgMBAh4BAheAAAoJEIW6y1rvrjICSD8P/1lMgU5VoPKPtdqBJVVeVkBZ
+/VF8VcHxj8X7whV2qcWIfYQuWIrAe/r7SGdPtLXiSUUu13uj5ivJ4Qdygzc57nTZ
+O8iDd5gIQgjs1QUI1VXH0mtc3I89Jgt5gX191L3yqLmWTo1ebmOk146KXYG4GrEI
+qtEyxjJD/q76vDyTY/4ItDoaPr/mE9A/kgXA5jKY8qGJn2D/UiURAbVdYcNqCsZC
+o3EwOfmWrs0PwTdRSzjVZ9och6dDtBRMhfycyx7uf2NITab9t/Fxeb9gh4xy5k+c
+kXrtS3cbkXMx+vxRurlZiUb5ESoyCiN3P4LVeKw5RCIdCsA1WVsaKR31LR59bGmE
+qr2iO32Ip3iHaePvY/RDF9kMlZCfouye0OaTuPRhBPAcqwoWrP/rwWuE4Tm86/7x
+f2wLcOOiCXve3clUyYygVZ7984+NuJLY5NYWVFs98l6lkZGZnmovA7VmQ5T+/aY8
+c7PeP+WgS5jWdMXNhUu3sPGlfOU2PxY0DaHhNSbaj1PTPSBIhZhXxNeo4WTxVIBi
+FG8h01l5EK7Vz86zHrWBLCkt62dRH8uxIw4UEnNSR7PLX1OTtdaV3hKLcUWN/CPL
+Wmp7i/5VUTMcliYXpfAFiRIGFwTQVDZFaGKs0LWy9/2Bm3bqqmVpcsdMZL2n0n7J
+y0ljIHCsOVHk5M0Dncs2uQINBF1vdWMBEADZAEoXuEmWnFMikFuh1vIO+Ks1t6t+
+g5fWvniHQFbdEF0xjOGPFmU0236LXLEqkvI5B9ZaShMC7g13omWyEMe4ppyMG9Bu
+81nxUeYFhvpRBWgDN1f+YmQc+Dh7G3ekwzSEJOwUqzyNY0ZjC4Dd+0xStT1q1liE
+a7jszn9qIH9xlBPwdwIxc1qMKXiBy3DOzOArap8TEO4ZBUsvgI8yVcHm2FehT+XF
+NymGouA0g1gkiMwfP9QDCiYNMUcB5Y8ldL/ZEqzSc7xNXaqK0RtWgN24U6RTXD1z
+bZ/UbxNpn9d8b0o/OKd6hJkgUspi/OGQ18sKoa/duCYzweE42EnBU0qyuInTyEqB
+bFX1C+Pfvyb+7IRo+SeFBDFJWDyJHyi+sUhmIFfSEoY2MxzWQJ8JMU3kHOLVsSHE
+PyUkpCPIQR7of6zKkWi3jT5qHvmRgtWOhIsksXr2QCCxgd5LvqM29hhqRu59cjSG
+0pO0yW1rfXaL/LK9GhH7w69lgvtOZ5oBh2eF8q7e0Oq/KQBYXxD0j+XjV5Yoxn3S
+W9uIeLgjHh9hA6cUkQd8MLSFbp9ku4qk5x+7FNO9eGR2jgvgb1U/pM7jw7H7CAoH
+hf2YkEKrAczkBS/267kzFlBXQC+51NG0mEyvFurba1m7GrqcEUAA/mgNCNzt8wlL
+vZtphLdmIHfpmQARAQABiQI2BBgBCAAgFiEE4sRUF77VwQQVTzQQhbrLWu+uMgIF
+Al1vdWMCGwwACgkQhbrLWu+uMgLNKQ//aK7akPLmAlcbBuiGq1C/jHl2x7GNsZus
+KPo5QwSL3LHWgVd0ZAe6nO5RWcwPAD+oPEB/EE69h2lS+RPcewa6QmElGv+Ydnkd
+0JIQvR689fdsV1bEK2Wpyma7+bM30Rm2t9ei2nOXVAHc0b04m4f2n7XWh8+yMlwC
+qPyucOB5VSQm5bbfJI0RJ74+MDWE9yHWeRRY26N6O9davqlU3uu4hGWolyPpRZ+B
+tjcoqech3yKGxo8oQdmuPAqq7TJ6PEwlqVtGQNyF7M65g/deeSjTa6m/T4Y1i0op
+IZLWbjpPnwxVK7n2eZfudPQra6/Rn8s9vXtJyFQD7uSDbuMDJMzovKsnaWRaRORq
+g8DMVn2FQMU7WYuBfpxFG7329IdQY32nve0BMB960djcdy5tZkHC1Xcv/fQ0TPM+
+jjrWPFpaRuIY6CaSnBIXJquPMUI3OgA1lhfQNgL6g7opH+aI2RIo+45gWcnFz3du
++NgNJ1Tye6XBY5xUxoebjbwKcldH0IRYQkGBqHMrkH4bQmLPHwuhmJYybCxxw96O
+1Nq4PFra6MEq4gkLHr/l7Nw+dcYc4mJtzL5q7tOA6YGSjrKCxUOVmKbGD5GjLlNl
+iDiNryfXiNtbbRW/XZ3sWEfuzWSZkjgfYPYHyE14iFATzmfOSWSU8C4Nt+Tsazvr
+OKMsp9EJ7so=
+=/Nn9
+-----END PGP PUBLIC KEY BLOCK-----
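
For context, a KEYS entry like the one added above is conventionally produced by exporting the signing
key with gpg and appending both the human-readable listing and the ASCII-armored block to the KEYS file.
A hedged sketch follows: the fingerprint is the one shown in the diff, while the KEYS path and the use
of Python instead of the usual shell one-liner are our own choices.

    # Illustrative only: append a KEYS entry like the block above. Requires GnuPG.
    import subprocess

    KEY_ID = "E2C45417BED5C104154F341085BACB5AEFAE3202"  # fingerprint from the diff

    with open("KEYS", "ab") as keys_file:
        # The pub/uid/sig/sub header lines of the entry.
        listing = subprocess.run(["gpg", "--list-sigs", KEY_ID],
                                 check=True, capture_output=True)
        keys_file.write(listing.stdout)
        # The ASCII-armored public key block itself.
        export = subprocess.run(["gpg", "--armor", "--export", KEY_ID],
                                check=True, capture_output=True)
        keys_file.write(export.stdout)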




[flink] branch release-1.9 updated: [hotfix][python] Fix the package name of PythonGatewayServer (#9351)

2019-08-04 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 078a69d  [hotfix][python] Fix the package name of PythonGatewayServer 
(#9351)
078a69d is described below

commit 078a69d232f47353b4c7728fc995b3bf8b97f820
Author: dianfu 
AuthorDate: Sun Aug 4 14:05:49 2019 +0800

[hotfix][python] Fix the package name of PythonGatewayServer (#9351)
---
 .../src/main/java/org/apache/flink/client/cli/ProgramOptions.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java 
b/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java
index ca2ad1a..ff0114c 100644
--- 
a/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java
+++ 
b/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java
@@ -81,7 +81,7 @@ public abstract class ProgramOptions extends 
CommandLineOptions {
line.getOptionValue(CLASS_OPTION.getOpt()) : null;
 
isPython = line.hasOption(PY_OPTION.getOpt()) | 
line.hasOption(PYMODULE_OPTION.getOpt())
-   | 
"org.apache.flink.python.client.PythonGatewayServer".equals(entryPointClass);
+   | 
"org.apache.flink.client.python.PythonGatewayServer".equals(entryPointClass);
// If specified the option -py(--python)
if (line.hasOption(PY_OPTION.getOpt())) {
// Cannot use option -py and -pym simultaneously.



[flink] branch master updated: [hotfix][python] Fix the package name of PythonGatewayServer (#9351)

2019-08-04 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 2c587b6  [hotfix][python] Fix the package name of PythonGatewayServer 
(#9351)
2c587b6 is described below

commit 2c587b6a3c88a1f9f69ce07100b0775ef3a50c1f
Author: dianfu 
AuthorDate: Sun Aug 4 14:05:49 2019 +0800

[hotfix][python] Fix the package name of PythonGatewayServer (#9351)
---
 .../src/main/java/org/apache/flink/client/cli/ProgramOptions.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java 
b/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java
index ca2ad1a..ff0114c 100644
--- 
a/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java
+++ 
b/flink-clients/src/main/java/org/apache/flink/client/cli/ProgramOptions.java
@@ -81,7 +81,7 @@ public abstract class ProgramOptions extends 
CommandLineOptions {
line.getOptionValue(CLASS_OPTION.getOpt()) : null;
 
isPython = line.hasOption(PY_OPTION.getOpt()) | 
line.hasOption(PYMODULE_OPTION.getOpt())
-   | 
"org.apache.flink.python.client.PythonGatewayServer".equals(entryPointClass);
+   | 
"org.apache.flink.client.python.PythonGatewayServer".equals(entryPointClass);
// If specified the option -py(--python)
if (line.hasOption(PY_OPTION.getOpt())) {
// Cannot use option -py and -pym simultaneously.
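
The one-word package-name fix above matters because the CLI decides whether a submitted job is a Python
job from three signals: the -py option, the -pym option, or an entry-point class equal to the gateway
server's fully qualified name (the Java code combines them with the non-short-circuiting |, which is
equivalent here since all operands are plain booleans). A Python paraphrase of the corrected predicate,
with function and parameter names of our own invention:

    # Paraphrase of the fixed ProgramOptions check; names are illustrative.
    def is_python_job(has_py_option, has_pym_option, entry_point_class):
        return (has_py_option
                or has_pym_option
                or entry_point_class == "org.apache.flink.client.python.PythonGatewayServer")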



[flink] branch release-1.9 updated: [FLINK-12704][python] Enable the configuration of using blink planner.

2019-08-01 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new e5a9904  [FLINK-12704][python] Enable the configuration of using blink 
planner.
e5a9904 is described below

commit e5a99049daf9cc53b2e6957157c5901448d1ad46
Author: Wei Zhong 
AuthorDate: Thu Aug 1 17:15:29 2019 +0800

[FLINK-12704][python] Enable the configuration of using blink planner.

This closes #9314
---
 flink-python/pyflink/table/__init__.py |   4 +
 flink-python/pyflink/table/environment_settings.py | 199 +
 flink-python/pyflink/table/table_environment.py| 141 ---
 .../table/tests/test_environment_settings.py   | 133 ++
 .../test_environment_settings_completeness.py  |  67 +++
 .../table/tests/test_table_environment_api.py  |  92 +-
 flink-python/pyflink/testing/test_case_utils.py|  19 ++
 7 files changed, 626 insertions(+), 29 deletions(-)

diff --git a/flink-python/pyflink/table/__init__.py 
b/flink-python/pyflink/table/__init__.py
index 48a150e..e69a9b7 100644
--- a/flink-python/pyflink/table/__init__.py
+++ b/flink-python/pyflink/table/__init__.py
@@ -27,6 +27,8 @@ Important classes of Flink Table API:
 - :class:`pyflink.table.TableConfig`
   A config to define the runtime behavior of the Table API.
   It is necessary when creating :class:`TableEnvironment`.
+- :class:`pyflink.table.EnvironmentSettings`
+  Defines all parameters that initialize a table environment.
 - :class:`pyflink.table.StreamQueryConfig` and 
:class:`pyflink.table.BatchQueryConfig`
   A query config holds parameters to configure the behavior of queries.
 - :class:`pyflink.table.TableSource`
@@ -53,6 +55,7 @@ Important classes of Flink Table API:
 """
 from __future__ import absolute_import
 
+from pyflink.table.environment_settings import EnvironmentSettings
 from pyflink.table.table import Table, GroupedTable, GroupWindowedTable, 
OverWindowedTable, \
 WindowGroupedTable
 from pyflink.table.table_config import TableConfig
@@ -67,6 +70,7 @@ __all__ = [
 'TableEnvironment',
 'StreamTableEnvironment',
 'BatchTableEnvironment',
+'EnvironmentSettings',
 'Table',
 'GroupedTable',
 'GroupWindowedTable',
diff --git a/flink-python/pyflink/table/environment_settings.py 
b/flink-python/pyflink/table/environment_settings.py
new file mode 100644
index 0000000..d6fda40
--- /dev/null
+++ b/flink-python/pyflink/table/environment_settings.py
@@ -0,0 +1,199 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+from pyflink.java_gateway import get_gateway
+
+__all__ = ['EnvironmentSettings']
+
+
+class EnvironmentSettings(object):
+"""
+Defines all parameters that initialize a table environment. Those 
parameters are used only
+during instantiation of a :class:`~pyflink.table.TableEnvironment` and 
cannot be changed
+afterwards.
+
+Example:
+::
+
+>>> EnvironmentSettings.new_instance() \\
+... .use_old_planner() \\
+... .in_streaming_mode() \\
+... .with_built_in_catalog_name("my_catalog") \\
+... .with_built_in_database_name("my_database") \\
+... .build()
+"""
+
+class Builder(object):
+"""
+A builder for :class:`EnvironmentSettings`.
+"""
+
+def __init__(self):
+gateway = get_gateway()
+self._j_builder = gateway.jvm.EnvironmentSettings.Builder()
+
+def use_old_planner(self):
+"""
+Sets the old Flink planner as the required module.
+
+This is the default behavior.
+
+:return: This object.
+:rtype: EnvironmentSettings.Builder
+  

[flink] branch master updated: [FLINK-12704][python] Enable the configuration of using blink planner.

2019-08-01 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new da3eb2e  [FLINK-12704][python] Enable the configuration of using blink 
planner.
da3eb2e is described below

commit da3eb2e07e7c9a2aeda2c3bef803624349ae709a
Author: Wei Zhong 
AuthorDate: Thu Aug 1 17:15:29 2019 +0800

[FLINK-12704][python] Enable the configuration of using blink planner.

This closes #9314
---
 flink-python/pyflink/table/__init__.py |   4 +
 flink-python/pyflink/table/environment_settings.py | 199 +
 flink-python/pyflink/table/table_environment.py| 141 ---
 .../table/tests/test_environment_settings.py   | 133 ++
 .../test_environment_settings_completeness.py  |  67 +++
 .../table/tests/test_table_environment_api.py  |  92 +-
 flink-python/pyflink/testing/test_case_utils.py|  19 ++
 7 files changed, 626 insertions(+), 29 deletions(-)

diff --git a/flink-python/pyflink/table/__init__.py 
b/flink-python/pyflink/table/__init__.py
index 48a150e..e69a9b7 100644
--- a/flink-python/pyflink/table/__init__.py
+++ b/flink-python/pyflink/table/__init__.py
@@ -27,6 +27,8 @@ Important classes of Flink Table API:
 - :class:`pyflink.table.TableConfig`
   A config to define the runtime behavior of the Table API.
   It is necessary when creating :class:`TableEnvironment`.
+- :class:`pyflink.table.EnvironmentSettings`
+  Defines all parameters that initialize a table environment.
 - :class:`pyflink.table.StreamQueryConfig` and 
:class:`pyflink.table.BatchQueryConfig`
   A query config holds parameters to configure the behavior of queries.
 - :class:`pyflink.table.TableSource`
@@ -53,6 +55,7 @@ Important classes of Flink Table API:
 """
 from __future__ import absolute_import
 
+from pyflink.table.environment_settings import EnvironmentSettings
 from pyflink.table.table import Table, GroupedTable, GroupWindowedTable, 
OverWindowedTable, \
 WindowGroupedTable
 from pyflink.table.table_config import TableConfig
@@ -67,6 +70,7 @@ __all__ = [
 'TableEnvironment',
 'StreamTableEnvironment',
 'BatchTableEnvironment',
+'EnvironmentSettings',
 'Table',
 'GroupedTable',
 'GroupWindowedTable',
diff --git a/flink-python/pyflink/table/environment_settings.py 
b/flink-python/pyflink/table/environment_settings.py
new file mode 100644
index 0000000..d6fda40
--- /dev/null
+++ b/flink-python/pyflink/table/environment_settings.py
@@ -0,0 +1,199 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+################################################################################
+from pyflink.java_gateway import get_gateway
+
+__all__ = ['EnvironmentSettings']
+
+
+class EnvironmentSettings(object):
+"""
+Defines all parameters that initialize a table environment. Those 
parameters are used only
+during instantiation of a :class:`~pyflink.table.TableEnvironment` and 
cannot be changed
+afterwards.
+
+Example:
+::
+
+>>> EnvironmentSettings.new_instance() \\
+... .use_old_planner() \\
+... .in_streaming_mode() \\
+... .with_built_in_catalog_name("my_catalog") \\
+... .with_built_in_database_name("my_database") \\
+... .build()
+"""
+
+class Builder(object):
+"""
+A builder for :class:`EnvironmentSettings`.
+"""
+
+def __init__(self):
+gateway = get_gateway()
+self._j_builder = gateway.jvm.EnvironmentSettings.Builder()
+
+def use_old_planner(self):
+"""
+Sets the old Flink planner as the required module.
+
+This is the default behavior.
+
+:return: This object.
+:rtype: EnvironmentSettings.Builder
+"""
+   
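
Putting the new class to use: the docstring above demonstrates the builder chain for the old planner; a
blink-planner variant would look like the sketch below. This is a sketch only: it assumes the builder
also exposes a use_blink_planner() counterpart to use_old_planner() and an in_batch_mode() mode switch,
and that the create() factories accept the environment_settings argument this patch adds to
table_environment.py.

    # Minimal sketch, assuming a use_blink_planner() sibling of use_old_planner().
    from pyflink.table import BatchTableEnvironment, EnvironmentSettings

    settings = EnvironmentSettings.new_instance() \
        .use_blink_planner() \
        .in_batch_mode() \
        .build()
    # Assumes create() grew an environment_settings keyword in this patch.
    t_env = BatchTableEnvironment.create(environment_settings=settings)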

[flink] branch release-1.9 updated: [hotfix] [travis] Fix the python travis failure (#9286)

2019-07-31 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new b8c21a0  [hotfix] [travis] Fix the python travis failure (#9286)
b8c21a0 is described below

commit b8c21a0241a03db282019eafb655d175acba6a87
Author: dianfu 
AuthorDate: Wed Jul 31 15:05:51 2019 +0800

[hotfix] [travis] Fix the python travis failure (#9286)
---
 docs/ops/cli.md| 18 --
 docs/ops/cli.zh.md | 18 --
 .../src/main/flink-bin/bin/pyflink-gateway-server.sh   |  5 ++---
 tools/travis_controller.sh |  3 ++-
 4 files changed, 20 insertions(+), 24 deletions(-)

diff --git a/docs/ops/cli.md b/docs/ops/cli.md
index 6d24f29..4e84267 100644
--- a/docs/ops/cli.md
+++ b/docs/ops/cli.md
@@ -100,40 +100,38 @@ These examples about how to submit a job in CLI.
 
 -   Run Python Table program:
 
-./bin/flink run -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program with pyFiles:
 
-./bin/flink run -py examples/python/table/batch/word_count.py -j 
 \
+./bin/flink run -py examples/python/table/batch/word_count.py \
 -pyfs 
file:///user.txt,hdfs:///$namenode_address/username.txt
 
 -   Run Python Table program with pyFiles and pyModule:
 
-./bin/flink run -pym batch.word_count -pyfs 
examples/python/table/batch -j 
+./bin/flink run -pym batch.word_count -pyfs examples/python/table/batch
 
 -   Run Python Table program with parallelism 16:
 
-./bin/flink run -p 16 -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -p 16 -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program with flink log output disabled:
 
-./bin/flink run -q -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -q -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program in detached mode:
 
-./bin/flink run -d -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -d -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program on a specific JobManager:
 
 ./bin/flink run -m myJMHost:8081 \
-   -py examples/python/table/batch/word_count.py \
-   -j 
+   -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program using a [per-job YARN 
cluster]({{site.baseurl}}/ops/deployment/yarn_setup.html#run-a-single-flink-job-on-hadoop-yarn)
 with 2 TaskManagers:
 
 ./bin/flink run -m yarn-cluster -yn 2 \
-   -py examples/python/table/batch/word_count.py \
-   -j 
+   -py examples/python/table/batch/word_count.py
 
 
 ### Job Management Examples
diff --git a/docs/ops/cli.zh.md b/docs/ops/cli.zh.md
index 8370fd8..b8cc94a 100644
--- a/docs/ops/cli.zh.md
+++ b/docs/ops/cli.zh.md
@@ -100,40 +100,38 @@ available.
 
 -   提交一个Python Table的作业:
 
-./bin/flink run -py WordCount.py -j 
+./bin/flink run -py WordCount.py
 
 -   提交一个有多个依赖的Python Table的作业:
 
-./bin/flink run -py examples/python/table/batch/word_count.py -j 
 \
+./bin/flink run -py examples/python/table/batch/word_count.py \
 -pyfs 
file:///user.txt,hdfs:///$namenode_address/username.txt
 
 -   提交一个有多个依赖的Python Table的作业,Python作业的主入口通过pym选项指定:
 
-./bin/flink run -pym batch.word_count -pyfs 
examples/python/table/batch -j 
+./bin/flink run -pym batch.word_count -pyfs examples/python/table/batch
 
 -   提交一个指定并发度为16的Python Table的作业:
 
-./bin/flink run -p 16 -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -p 16 -py examples/python/table/batch/word_count.py
 
 -   提交一个关闭flink日志输出的Python Table的作业:
 
-./bin/flink run -q -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -q -py examples/python/table/batch/word_count.py
 
 -   提交一个运行在detached模式下的Python Table的作业:
 
-./bin/flink run -d -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -d -py examples/python/table/batch/word_count.py
 
 -   提交一个运行在指定JobManager上的Python Table的作业:
 
 ./bin/flink run -m myJMHost:8081 \
--py examples/python/table/batch/word_count.py \
--j 
+-py examples/python/table/batch/word_count.py
 
 -   提交一个运行在有两个TaskManager的[per-job YARN 
cluster]({{site.baseurl}}/ops/deployment/yarn_setup.html#run-a-single-flink-job-on-hadoop-yarn)的Python
 Table的作业:
 
 ./bin/flink

[flink] branch master updated: [hotfix] [travis] Fix the python travis failure (#9286)

2019-07-31 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 6dac7a7  [hotfix] [travis] Fix the python travis failure (#9286)
6dac7a7 is described below

commit 6dac7a7f7fd72aba903ac339f9f6bfcd44897d39
Author: dianfu 
AuthorDate: Wed Jul 31 15:05:51 2019 +0800

[hotfix] [travis] Fix the python travis failure (#9286)
---
 docs/ops/cli.md| 18 --
 docs/ops/cli.zh.md | 18 --
 .../src/main/flink-bin/bin/pyflink-gateway-server.sh   |  5 ++---
 tools/travis_controller.sh |  3 ++-
 4 files changed, 20 insertions(+), 24 deletions(-)

diff --git a/docs/ops/cli.md b/docs/ops/cli.md
index 6d24f29..4e84267 100644
--- a/docs/ops/cli.md
+++ b/docs/ops/cli.md
@@ -100,40 +100,38 @@ These examples about how to submit a job in CLI.
 
 -   Run Python Table program:
 
-./bin/flink run -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program with pyFiles:
 
-./bin/flink run -py examples/python/table/batch/word_count.py -j 
 \
+./bin/flink run -py examples/python/table/batch/word_count.py \
 -pyfs 
file:///user.txt,hdfs:///$namenode_address/username.txt
 
 -   Run Python Table program with pyFiles and pyModule:
 
-./bin/flink run -pym batch.word_count -pyfs 
examples/python/table/batch -j 
+./bin/flink run -pym batch.word_count -pyfs examples/python/table/batch
 
 -   Run Python Table program with parallelism 16:
 
-./bin/flink run -p 16 -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -p 16 -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program with flink log output disabled:
 
-./bin/flink run -q -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -q -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program in detached mode:
 
-./bin/flink run -d -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -d -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program on a specific JobManager:
 
 ./bin/flink run -m myJMHost:8081 \
-   -py examples/python/table/batch/word_count.py \
-   -j 
+   -py examples/python/table/batch/word_count.py
 
 -   Run Python Table program using a [per-job YARN 
cluster]({{site.baseurl}}/ops/deployment/yarn_setup.html#run-a-single-flink-job-on-hadoop-yarn)
 with 2 TaskManagers:
 
 ./bin/flink run -m yarn-cluster -yn 2 \
-   -py examples/python/table/batch/word_count.py \
-   -j 
+   -py examples/python/table/batch/word_count.py
 
 
 ### Job Management Examples
diff --git a/docs/ops/cli.zh.md b/docs/ops/cli.zh.md
index 8370fd8..b8cc94a 100644
--- a/docs/ops/cli.zh.md
+++ b/docs/ops/cli.zh.md
@@ -100,40 +100,38 @@ available.
 
 -   提交一个Python Table的作业:
 
-./bin/flink run -py WordCount.py -j 
+./bin/flink run -py WordCount.py
 
 -   提交一个有多个依赖的Python Table的作业:
 
-./bin/flink run -py examples/python/table/batch/word_count.py -j 
 \
+./bin/flink run -py examples/python/table/batch/word_count.py \
 -pyfs 
file:///user.txt,hdfs:///$namenode_address/username.txt
 
 -   提交一个有多个依赖的Python Table的作业,Python作业的主入口通过pym选项指定:
 
-./bin/flink run -pym batch.word_count -pyfs 
examples/python/table/batch -j 
+./bin/flink run -pym batch.word_count -pyfs examples/python/table/batch
 
 -   提交一个指定并发度为16的Python Table的作业:
 
-./bin/flink run -p 16 -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -p 16 -py examples/python/table/batch/word_count.py
 
 -   提交一个关闭flink日志输出的Python Table的作业:
 
-./bin/flink run -q -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -q -py examples/python/table/batch/word_count.py
 
 -   提交一个运行在detached模式下的Python Table的作业:
 
-./bin/flink run -d -py examples/python/table/batch/word_count.py -j 

+./bin/flink run -d -py examples/python/table/batch/word_count.py
 
 -   提交一个运行在指定JobManager上的Python Table的作业:
 
 ./bin/flink run -m myJMHost:8081 \
--py examples/python/table/batch/word_count.py \
--j 
+-py examples/python/table/batch/word_count.py
 
 -   提交一个运行在有两个TaskManager的[per-job YARN 
cluster]({{site.baseurl}}/ops/deployment/yarn_setup.html#run-a-single-flink-job-on-hadoop-yarn)的Python
 Table的作业:
 
 ./bin/flink run -m yarn

[flink] branch release-1.9 updated: [FLINK-13488][Python] Remove python 3.3/3.4 support (#9278)

2019-07-30 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 8751899  [FLINK-13488][Python] Remove python 3.3/3.4 support (#9278)
8751899 is described below

commit 8751899a6e055e8e82203951d3c85a7e5af75f9c
Author: Jincheng Sun 
AuthorDate: Tue Jul 30 15:05:25 2019 +0200

[FLINK-13488][Python] Remove python 3.3/3.4 support (#9278)
---
 flink-python/dev/lint-python.sh | 2 +-
 flink-python/setup.py   | 2 --
 flink-python/tox.ini| 2 +-
 3 files changed, 2 insertions(+), 4 deletions(-)

diff --git a/flink-python/dev/lint-python.sh b/flink-python/dev/lint-python.sh
index b4f836b..8aee772 100755
--- a/flink-python/dev/lint-python.sh
+++ b/flink-python/dev/lint-python.sh
@@ -178,7 +178,7 @@ function install_miniconda() {
 
 # Install some kinds of py env.
 function install_py_env() {
-py_env=("2.7" "3.3" "3.4" "3.5" "3.6" "3.7")
+py_env=("2.7" "3.5" "3.6" "3.7")
 for ((i=0;i<${#py_env[@]};i++)) do
 if [ -d "$CURRENT_DIR/.conda/envs/${py_env[i]}" ]; then
 rm -rf "$CURRENT_DIR/.conda/envs/${py_env[i]}"
diff --git a/flink-python/setup.py b/flink-python/setup.py
index 2b417f1..4854bfa 100644
--- a/flink-python/setup.py
+++ b/flink-python/setup.py
@@ -192,8 +192,6 @@ run sdist.
 'Development Status :: 1 - Planning',
 'License :: OSI Approved :: Apache Software License',
 'Programming Language :: Python :: 2.7',
-'Programming Language :: Python :: 3.3',
-'Programming Language :: Python :: 3.4',
 'Programming Language :: Python :: 3.5',
 'Programming Language :: Python :: 3.6',
 'Programming Language :: Python :: 3.7']
diff --git a/flink-python/tox.ini b/flink-python/tox.ini
index e0a6c29..baa8621 100644
--- a/flink-python/tox.ini
+++ b/flink-python/tox.ini
@@ -21,7 +21,7 @@
 # in multiple virtualenvs. This configuration file will run the
 # test suite on all supported python versions.
 # new environments will be excluded by default unless explicitly added to 
envlist.
-envlist = py27, py33, py34, py35, py36, py37
+envlist = py27, py35, py36, py37
 
 [testenv]
 whitelist_externals=



[flink] branch master updated (305051c -> 536d321)

2019-07-30 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 305051c  [FLINK-12249][table] Fix type equivalence check problems for 
Window Aggregates
 add 536d321  [FLINK-13488][Python] Remove python 3.3/3.4 support (#9278)

No new revisions were added by this update.

Summary of changes:
 flink-python/dev/lint-python.sh | 2 +-
 flink-python/setup.py   | 2 --
 flink-python/tox.ini| 2 +-
 3 files changed, 2 insertions(+), 4 deletions(-)
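
With 3.3 and 3.4 gone, the support matrix is Python 2.7 and 3.5-3.7, kept in sync across the conda
environments in lint-python.sh, the trove classifiers in setup.py, and the tox envlist. One common way
to enforce the same matrix at import or install time, shown purely as an illustration and not taken
from this patch, is an explicit interpreter check:

    # Illustrative guard matching the support matrix above; not part of the patch.
    import sys

    SUPPORTED = {(2, 7), (3, 5), (3, 6), (3, 7)}
    if sys.version_info[:2] not in SUPPORTED:
        raise RuntimeError("PyFlink supports Python 2.7, 3.5, 3.6 and 3.7; "
                           "found %d.%d" % sys.version_info[:2])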



[flink] branch release-1.9 updated: [hotfix][python] Change "flink-python-" to "flink-python" for the change of artifact of flink-python module (#9270)

2019-07-30 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 8405bef  [hotfix][python] Change "flink-python-" to "flink-python" for 
the change  of artifact of flink-python module (#9270)
8405bef is described below

commit 8405bef90214fe10f17e2e59db804963a34c7440
Author: HuangXingBo 
AuthorDate: Tue Jul 30 15:34:33 2019 +0800

[hotfix][python] Change "flink-python-" to "flink-python" for the change  
of artifact of flink-python module (#9270)
---
 .../src/main/java/org/apache/flink/client/program/PackagedProgram.java  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
 
b/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
index 648709d..629a147 100644
--- 
a/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
+++ 
b/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
@@ -485,7 +485,7 @@ public class PackagedProgram {
@Override
public FileVisitResult visitFile(Path 
file, BasicFileAttributes attrs) throws IOException {
FileVisitResult result = 
super.visitFile(file, attrs);
-   if 
(file.getFileName().toString().startsWith("flink-python-")) {
+   if 
(file.getFileName().toString().startsWith("flink-python")) {
pythonJarPath.add(file);
}
return result;



[flink] branch master updated: [hotfix][python] Change "flink-python-" to "flink-python" for the change of artifact of flink-python module (#9270)

2019-07-30 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 589fd4d  [hotfix][python] Change "flink-python-" to "flink-python" for 
the change  of artifact of flink-python module (#9270)
589fd4d is described below

commit 589fd4d8700bf54170cdc01a588f2b654cd1a7b4
Author: HuangXingBo 
AuthorDate: Tue Jul 30 15:34:33 2019 +0800

[hotfix][python] Change "flink-python-" to "flink-python" for the change  
of artifact of flink-python module (#9270)
---
 .../src/main/java/org/apache/flink/client/program/PackagedProgram.java  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
 
b/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
index 648709d..629a147 100644
--- 
a/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
+++ 
b/flink-clients/src/main/java/org/apache/flink/client/program/PackagedProgram.java
@@ -485,7 +485,7 @@ public class PackagedProgram {
@Override
public FileVisitResult visitFile(Path 
file, BasicFileAttributes attrs) throws IOException {
FileVisitResult result = 
super.visitFile(file, attrs);
-   if 
(file.getFileName().toString().startsWith("flink-python-")) {
+   if 
(file.getFileName().toString().startsWith("flink-python")) {
pythonJarPath.add(file);
}
return result;
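
The reason for relaxing the prefix: the flink-python artifact was renamed in this release cycle (it now
carries a Scala suffix, e.g. flink-python_2.11), so the old prefix "flink-python-" no longer matches the
jar while the bare "flink-python" does. A two-line illustration; the jar file name below is an example
of the shape, not taken from the patch:

    # Why the trailing dash had to go; the jar name is illustrative only.
    jar = "flink-python_2.11-1.9.0.jar"
    assert jar.startswith("flink-python")        # new check: matches
    assert not jar.startswith("flink-python-")   # old check: missed it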



[flink] branch release-1.9 updated: [FLINK-13409][python] Supported java UDFs in python API.

2019-07-25 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new cae83e5  [FLINK-13409][python] Supported java UDFs in python API.
cae83e5 is described below

commit cae83e53ed47f48e81c70dc7d7fcee6497d7aecd
Author: Wei Zhong 
AuthorDate: Thu Jul 25 10:03:19 2019 +0800

[FLINK-13409][python] Supported java UDFs in python API.

This closes #9222
---
 docs/dev/table/tableApi.md |  22 +++-
 docs/dev/table/tableApi.zh.md  |  22 +++-
 docs/dev/table/udfs.md | 135 +
 docs/dev/table/udfs.zh.md  | 135 +
 flink-python/pyflink/table/table.py|  51 
 flink-python/pyflink/table/table_environment.py|  33 +
 flink-python/pyflink/table/tests/test_correlate.py |  71 +++
 .../table/tests/test_environment_completeness.py   |   3 +-
 .../table/tests/test_table_environment_api.py  |  26 
 .../java/org/apache/flink/table/api/Table.java |   8 +-
 10 files changed, 496 insertions(+), 10 deletions(-)

diff --git a/docs/dev/table/tableApi.md b/docs/dev/table/tableApi.md
index 2a90811..36a9942 100644
--- a/docs/dev/table/tableApi.md
+++ b/docs/dev/table/tableApi.md
@@ -1336,7 +1336,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 Batch Streaming
   

-Currently not supported in python API.
+Joins a table with the results of a table function. Each row of the 
left (outer) table is joined with all rows produced by the corresponding call 
of the table function. A row of the left (outer) table is dropped, if its table 
function call returns an empty result.
+
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.join_lateral("split(c).as(s, t, v)").select("a, b, s, t, v")
+{% endhighlight %}
   
 
 
@@ -1345,7 +1354,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 Batch Streaming
   
   
-Currently not supported in python API.
+Joins a table with the results of a table function. Each row of the 
left (outer) table is joined with all rows produced by the corresponding call 
of the table function. If a table function call returns an empty result, the 
corresponding outer row is preserved and the result padded with null values.
+Note: Currently, the predicate of a table function left 
outer join can only be empty or literal true.
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.left_outer_join_lateral("split(c).as(s, t, v)").select("a, b, 
s, t, v")
+{% endhighlight %}
   
 
 
diff --git a/docs/dev/table/tableApi.zh.md b/docs/dev/table/tableApi.zh.md
index 729a531..28802a6 100644
--- a/docs/dev/table/tableApi.zh.md
+++ b/docs/dev/table/tableApi.zh.md
@@ -1335,7 +1335,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 批处理 流处理
   

-Python API暂不支持。
+
将一张表与一个表函数的执行结果执行内连接操作。左表的每一行都会进行一次表函数调用,调用将会返回0个,1个或多个结果,再与这些结果执行连接操作。如果一行数据对应的表函数调用返回了一个空的结果集,则这行数据会被丢弃。
+
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.join_lateral("split(c).as(s, t, v)").select("a, b, s, t, v")
+{% endhighlight %}
   
 
 
@@ -1344,7 +1353,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 批处理 流处理
   
   
-Python API暂不支持。
+
将一张表与一个表函数的执行结果执行左连接操作。左表的每一行都会进行一次表函数调用,调用将会返回0个,1个或多个结果,再与这些结果执行连接操作。如果一行数据对应的表函数调用返回了一个空的结果集,这行数据依然会被保留,对应的右表数值用null(python为None)填充。
+注意:目前,表函数的左连接操作的连接条件(join predicate)只能为空或者为"true"常量。
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.left_outer_join_lateral("split(c).as(s, t, v)").select("a, b, 
s, t, v")
+{% endhighlight %}
   
 
 
diff --git a/docs/dev/table/udfs.md b/docs/dev/table/udfs.md
index 87bc804..dbb9a97 100644
--- a/docs/dev/t

[flink] branch master updated: [FLINK-13409][python] Supported java UDFs in python API.

2019-07-25 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 366e916  [FLINK-13409][python] Supported java UDFs in python API.
366e916 is described below

commit 366e916e71172b1b73f802b5b4bd19ef252e68ea
Author: Wei Zhong 
AuthorDate: Thu Jul 25 10:03:19 2019 +0800

[FLINK-13409][python] Supported java UDFs in python API.

This closes #9222
---
 docs/dev/table/tableApi.md |  22 +++-
 docs/dev/table/tableApi.zh.md  |  22 +++-
 docs/dev/table/udfs.md | 135 +
 docs/dev/table/udfs.zh.md  | 135 +
 flink-python/pyflink/table/table.py|  51 
 flink-python/pyflink/table/table_environment.py|  33 +
 flink-python/pyflink/table/tests/test_correlate.py |  71 +++
 .../table/tests/test_environment_completeness.py   |   3 +-
 .../table/tests/test_table_environment_api.py  |  26 
 .../java/org/apache/flink/table/api/Table.java |   8 +-
 10 files changed, 496 insertions(+), 10 deletions(-)

diff --git a/docs/dev/table/tableApi.md b/docs/dev/table/tableApi.md
index 2a90811..36a9942 100644
--- a/docs/dev/table/tableApi.md
+++ b/docs/dev/table/tableApi.md
@@ -1336,7 +1336,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 Batch Streaming
   

-Currently not supported in python API.
+Joins a table with the results of a table function. Each row of the 
left (outer) table is joined with all rows produced by the corresponding call 
of the table function. A row of the left (outer) table is dropped, if its table 
function call returns an empty result.
+
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.join_lateral("split(c).as(s, t, v)").select("a, b, s, t, v")
+{% endhighlight %}
   
 
 
@@ -1345,7 +1354,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 Batch Streaming
   
   
-Currently not supported in python API.
+Joins a table with the results of a table function. Each row of the 
left (outer) table is joined with all rows produced by the corresponding call 
of the table function. If a table function call returns an empty result, the 
corresponding outer row is preserved and the result padded with null values.
+Note: Currently, the predicate of a table function left 
outer join can only be empty or literal true.
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.left_outer_join_lateral("split(c).as(s, t, v)").select("a, b, 
s, t, v")
+{% endhighlight %}
   
 
 
diff --git a/docs/dev/table/tableApi.zh.md b/docs/dev/table/tableApi.zh.md
index 729a531..28802a6 100644
--- a/docs/dev/table/tableApi.zh.md
+++ b/docs/dev/table/tableApi.zh.md
@@ -1335,7 +1335,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 批处理 流处理
   

-Python API暂不支持。
+
将一张表与一个表函数的执行结果执行内连接操作。左表的每一行都会进行一次表函数调用,调用将会返回0个,1个或多个结果,再与这些结果执行连接操作。如果一行数据对应的表函数调用返回了一个空的结果集,则这行数据会被丢弃。
+
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.join_lateral("split(c).as(s, t, v)").select("a, b, s, t, v")
+{% endhighlight %}
   
 
 
@@ -1344,7 +1353,16 @@ full_outer_result = left.full_outer_join(right, "a = 
d").select("a, b, e")
 批处理 流处理
   
   
-Python API暂不支持。
+
将一张表与一个表函数的执行结果执行左连接操作。左表的每一行都会进行一次表函数调用,调用将会返回0个,1个或多个结果,再与这些结果执行连接操作。如果一行数据对应的表函数调用返回了一个空的结果集,这行数据依然会被保留,对应的右表数值用null(python为None)填充。
+注意:目前,表函数的左连接操作的连接条件(join predicate)只能为空或者为"true"常量。
+{% highlight python %}
+# register Java User-Defined Table Function
+table_env.register_java_function("split", "com.my.udf.MySplitUDTF")
+
+# join
+orders = table_env.scan("Orders")
+result = orders.left_outer_join_lateral("split(c).as(s, t, v)").select("a, b, 
s, t, v")
+{% endhighlight %}
   
 
 
diff --git a/docs/dev/table/udfs.md b/docs/dev/table/udfs.md
index 87bc804..dbb9a97 100644
--- a/docs/dev/table/udfs.md
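
The doc snippets in this patch are fragments; stitched together, a self-contained version of the
join_lateral flow looks roughly like the sketch below. The UDTF class name com.my.udf.MySplitUDTF is
the placeholder used in the docs and must be on the JVM classpath, and a table named Orders with
columns a, b, c is assumed to have been registered beforehand:

    # Rough end-to-end sketch of register_java_function + join_lateral.
    from pyflink.dataset import ExecutionEnvironment
    from pyflink.table import BatchTableEnvironment

    env = ExecutionEnvironment.get_execution_environment()
    t_env = BatchTableEnvironment.create(env)
    # The Java UDTF must already be on the classpath; the name is the docs' placeholder.
    t_env.register_java_function("split", "com.my.udf.MySplitUDTF")

    orders = t_env.scan("Orders")  # assumes a registered table "Orders"
    result = orders.join_lateral("split(c).as(s, t, v)").select("a, b, s, t, v")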

[flink] branch release-1.9 updated: [FLINK-13368][python] Add Configuration Class for Python Table API to Align with Java.

2019-07-23 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 39f4fc2  [FLINK-13368][python] Add Configuration Class for Python 
Table API to Align with Java.
39f4fc2 is described below

commit 39f4fc28fddcd3b26058ab8f7bdfc1717b63b91f
Author: Wei Zhong 
AuthorDate: Mon Jul 22 20:36:58 2019 +0800

[FLINK-13368][python] Add Configuration Class for Python Table API to Align 
with Java.

This closes #9199
---
 flink-python/pyflink/common/__init__.py|   2 +
 flink-python/pyflink/common/configuration.py   | 254 +
 .../pyflink/common/tests/test_configuration.py | 165 +
 flink-python/pyflink/table/table_config.py |  20 ++
 .../pyflink/table/tests/test_table_config.py   |  97 
 .../table/tests/test_table_config_completeness.py  |   2 +-
 .../table/tests/test_table_environment_api.py  |  38 ---
 7 files changed, 539 insertions(+), 39 deletions(-)

diff --git a/flink-python/pyflink/common/__init__.py 
b/flink-python/pyflink/common/__init__.py
index ceef3c8..ca27df7 100644
--- a/flink-python/pyflink/common/__init__.py
+++ b/flink-python/pyflink/common/__init__.py
@@ -22,12 +22,14 @@ Important classes used by both Flink Streaming and Batch 
API:
 - :class:`ExecutionConfig`:
   A config to define the behavior of the program execution.
 """
+from pyflink.common.configuration import Configuration
 from pyflink.common.execution_config import ExecutionConfig
 from pyflink.common.execution_mode import ExecutionMode
 from pyflink.common.input_dependency_constraint import 
InputDependencyConstraint
 from pyflink.common.restart_strategy import RestartStrategies, 
RestartStrategyConfiguration
 
 __all__ = [
+'Configuration',
 'ExecutionConfig',
 'ExecutionMode',
 'InputDependencyConstraint',
diff --git a/flink-python/pyflink/common/configuration.py 
b/flink-python/pyflink/common/configuration.py
new file mode 100644
index 0000000..0adc463
--- /dev/null
+++ b/flink-python/pyflink/common/configuration.py
@@ -0,0 +1,254 @@
+################################################################################
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+from pyflink.java_gateway import get_gateway
+
+
+class Configuration:
+"""
+Lightweight configuration object which stores key/value pairs.
+"""
+
+def __init__(self, other=None, j_configuration=None):
+"""
+Creates a new configuration.
+
+:param other: Optional, if this parameter exists, creates a new 
configuration with a
+  copy of the given configuration.
+:type other: Configuration
+:param j_configuration: Optional, the py4j java configuration object, 
if this parameter
+exists, creates a wrapper for it.
+:type j_configuration: py4j.java_gateway.JavaObject
+"""
+if j_configuration is not None:
+self._j_configuration = j_configuration
+else:
+gateway = get_gateway()
+JConfiguration = 
gateway.jvm.org.apache.flink.configuration.Configuration
+if other is not None:
+self._j_configuration = JConfiguration(other._j_configuration)
+else:
+self._j_configuration = JConfiguration()
+
+def get_string(self, key, default_value):
+"""
+Returns the value associated with the given key as a string.
+
+:param key: The key pointing to the associated value.
+:type key: str
+:param default_value: The default value which is returned in case 
there is no value
+  associated with the given key.
+:type default_value: str
+:return: The (default) value associated with the given key.
+:rtype: str
+"""
+return self._j_configuration.getString(key, default_value)
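
As a quick orientation for the new class, a minimal sketch follows. Only the constructors and get_string appear in the diff above; set_string is an assumption, mirroring Java's Configuration.setString.

{% highlight python %}
from pyflink.common import Configuration

config = Configuration()
config.set_string("taskmanager.numberOfTaskSlots", "4")  # assumed setter

# The second argument is the default returned when the key is absent.
print(config.get_string("taskmanager.numberOfTaskSlots", "1"))  # prints '4'

# The copy constructor shown above produces an independent copy.
copied = Configuration(config)
print(copied.get_string("parallelism.default", "1"))  # key absent, prints '1'
{% endhighlight %}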

[flink] branch master updated: [FLINK-13368][python] Add Configuration Class for Python Table API to Align with Java.

2019-07-23 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 2d8ba48  [FLINK-13368][python] Add Configuration Class for Python 
Table API to Align with Java.
2d8ba48 is described below

commit 2d8ba48107201a933701a5181b95da803e4055b9
Author: Wei Zhong 
AuthorDate: Mon Jul 22 20:36:58 2019 +0800

[FLINK-13368][python] Add Configuration Class for Python Table API to Align 
with Java.

This closes #9199
---
 flink-python/pyflink/common/__init__.py|   2 +
 flink-python/pyflink/common/configuration.py   | 254 +
 .../pyflink/common/tests/test_configuration.py | 165 +
 flink-python/pyflink/table/table_config.py |  20 ++
 .../pyflink/table/tests/test_table_config.py   |  97 
 .../table/tests/test_table_config_completeness.py  |   2 +-
 .../table/tests/test_table_environment_api.py  |  38 ---
 7 files changed, 539 insertions(+), 39 deletions(-)

diff --git a/flink-python/pyflink/common/__init__.py 
b/flink-python/pyflink/common/__init__.py
index ceef3c8..ca27df7 100644
--- a/flink-python/pyflink/common/__init__.py
+++ b/flink-python/pyflink/common/__init__.py
@@ -22,12 +22,14 @@ Important classes used by both Flink Streaming and Batch 
API:
 - :class:`ExecutionConfig`:
   A config to define the behavior of the program execution.
 """
+from pyflink.common.configuration import Configuration
 from pyflink.common.execution_config import ExecutionConfig
 from pyflink.common.execution_mode import ExecutionMode
 from pyflink.common.input_dependency_constraint import 
InputDependencyConstraint
 from pyflink.common.restart_strategy import RestartStrategies, 
RestartStrategyConfiguration
 
 __all__ = [
+'Configuration',
 'ExecutionConfig',
 'ExecutionMode',
 'InputDependencyConstraint',
diff --git a/flink-python/pyflink/common/configuration.py 
b/flink-python/pyflink/common/configuration.py
new file mode 100644
index 000..0adc463
--- /dev/null
+++ b/flink-python/pyflink/common/configuration.py
@@ -0,0 +1,254 @@
+
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#  http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+# limitations under the License.
+
+from pyflink.java_gateway import get_gateway
+
+
+class Configuration:
+"""
+Lightweight configuration object which stores key/value pairs.
+"""
+
+def __init__(self, other=None, j_configuration=None):
+"""
+Creates a new configuration.
+
+:param other: Optional, if this parameter exists, creates a new 
configuration with a
+  copy of the given configuration.
+:type other: Configuration
+:param j_configuration: Optional, the py4j java configuration object, 
if this parameter
+exists, creates a wrapper for it.
+:type j_configuration: py4j.java_gateway.JavaObject
+"""
+if j_configuration is not None:
+self._j_configuration = j_configuration
+else:
+gateway = get_gateway()
+JConfiguration = 
gateway.jvm.org.apache.flink.configuration.Configuration
+if other is not None:
+self._j_configuration = JConfiguration(other._j_configuration)
+else:
+self._j_configuration = JConfiguration()
+
+def get_string(self, key, default_value):
+"""
+Returns the value associated with the given key as a string.
+
+:param key: The key pointing to the associated value.
+:type key: str
+:param default_value: The default value which is returned in case 
there is no value
+  associated with the given key.
+:type default_value: str
+:return: The (default) value associated with the given key.
+:rtype: str
+"""
+return self._j_configuration.getString(key, default_value)

[flink] branch release-1.9 updated: [FLINK-13299][travis][python] fix flink-python failed on Travis because of incompatible virtualenv (#9140)

2019-07-17 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 64eb969  [FLINK-13299][travis][python] fix flink-python failed on 
Travis because of incompatible virtualenv (#9140)
64eb969 is described below

commit 64eb96901761930454f6d9058a31e0104c33dee7
Author: WeiZhong94 <44194288+weizhon...@users.noreply.github.com>
AuthorDate: Wed Jul 17 21:50:44 2019 +0800

[FLINK-13299][travis][python] fix flink-python failed on Travis because of 
incompatible virtualenv (#9140)
---
 flink-python/dev/lint-python.sh | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/flink-python/dev/lint-python.sh b/flink-python/dev/lint-python.sh
index 66de146..b4f836b 100755
--- a/flink-python/dev/lint-python.sh
+++ b/flink-python/dev/lint-python.sh
@@ -213,7 +213,9 @@ function install_tox() {
 fi
 fi
 
-$CONDA_PATH install -p $CONDA_HOME -c conda-forge tox -y -q 2>&1 >/dev/null
+# virtualenv 16.6.2, released on 2019-07-14, is incompatible with py27 and py34,
+# so force-install an older version (16.0.0) to avoid this problem.
+$CONDA_PATH install -p $CONDA_HOME -c conda-forge virtualenv=16.0.0 tox -y 
-q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install tox failed \
 please try to exec the script again.\



[flink] branch master updated: [FLINK-13299][travis][python] fix flink-python failed on Travis because of incompatible virtualenv (#9140)

2019-07-17 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 200a5bf  [FLINK-13299][travis][python] fix flink-python failed on 
Travis because of incompatible virtualenv (#9140)
200a5bf is described below

commit 200a5bf9dca9d398cf07879d4d1e407a2f41d839
Author: WeiZhong94 <44194288+weizhon...@users.noreply.github.com>
AuthorDate: Wed Jul 17 21:50:44 2019 +0800

[FLINK-13299][travis][python] fix flink-python failed on Travis because of 
incompatible virtualenv (#9140)
---
 flink-python/dev/lint-python.sh | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/flink-python/dev/lint-python.sh b/flink-python/dev/lint-python.sh
index 66de146..b4f836b 100755
--- a/flink-python/dev/lint-python.sh
+++ b/flink-python/dev/lint-python.sh
@@ -213,7 +213,9 @@ function install_tox() {
 fi
 fi
 
-$CONDA_PATH install -p $CONDA_HOME -c conda-forge tox -y -q 2>&1 >/dev/null
+# virtualenv 16.6.2, released on 2019-07-14, is incompatible with py27 and py34,
+# so force-install an older version (16.0.0) to avoid this problem.
+$CONDA_PATH install -p $CONDA_HOME -c conda-forge virtualenv=16.0.0 tox -y 
-q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install tox failed \
 please try to exec the script again.\



[flink] branch release-1.9 updated: [FLINK-13263] [python] Supports explain DAG plan in flink-python

2019-07-15 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
 new 4497072  [FLINK-13263] [python] Supports explain DAG plan in 
flink-python
4497072 is described below

commit 4497072f7baae228c506385fb9f1c2c0450ef55c
Author: godfreyhe 
AuthorDate: Mon Jul 15 18:09:31 2019 +0800

[FLINK-13263] [python] Supports explain DAG plan in flink-python

This closes #9114
---
 flink-python/pyflink/table/table_environment.py| 14 +++--
 .../table/tests/test_table_environment_api.py  | 69 +-
 flink-python/pyflink/util/exceptions.py|  8 +++
 3 files changed, 85 insertions(+), 6 deletions(-)

diff --git a/flink-python/pyflink/table/table_environment.py 
b/flink-python/pyflink/table/table_environment.py
index b5619e8..071c1f5 100644
--- a/flink-python/pyflink/table/table_environment.py
+++ b/flink-python/pyflink/table/table_environment.py
@@ -224,15 +224,21 @@ class TableEnvironment(object):
 j_table_name_array = self._j_tenv.listTables()
 return [item for item in j_table_name_array]
 
-def explain(self, table):
+def explain(self, table=None, extended=False):
 """
 Returns the AST of the specified Table API and SQL queries and the 
execution plan to compute
-the result of the given :class:`Table`.
+the result of the given :class:`Table` or multi-sinks plan.
 
-:param table: The table to be explained.
+:param table: The table to be explained. If table is None, the plan of the multi-sinks job is explained, otherwise the plan of the given table.
+:param extended: Whether the plan should contain additional properties, e.g. estimated cost, traits.
:return: The AST and the execution plan as a string.
 """
-return self._j_tenv.explain(table._j_table)
+if table is None:
+return self._j_tenv.explain(extended)
+else:
+return self._j_tenv.explain(table._j_table, extended)
 
 def sql_query(self, query):
 """
diff --git a/flink-python/pyflink/table/tests/test_table_environment_api.py 
b/flink-python/pyflink/table/tests/test_table_environment_api.py
index 54188ff..ef787f8 100644
--- a/flink-python/pyflink/table/tests/test_table_environment_api.py
+++ b/flink-python/pyflink/table/tests/test_table_environment_api.py
@@ -22,11 +22,13 @@ from py4j.compat import unicode
 
 from pyflink.dataset import ExecutionEnvironment
 from pyflink.datastream import StreamExecutionEnvironment
-from pyflink.table.table_environment import BatchTableEnvironment, 
StreamTableEnvironment
+from pyflink.table import DataTypes, CsvTableSink, StreamTableEnvironment
 from pyflink.table.table_config import TableConfig
-from pyflink.table.types import DataTypes, RowType
+from pyflink.table.table_environment import BatchTableEnvironment
+from pyflink.table.types import RowType
 from pyflink.testing import source_sink_utils
 from pyflink.testing.test_case_utils import PyFlinkStreamTableTestCase, 
PyFlinkBatchTableTestCase
+from pyflink.util.exceptions import TableException
 
 
 class StreamTableEnvironmentTests(PyFlinkStreamTableTestCase):
@@ -103,6 +105,38 @@ class 
StreamTableEnvironmentTests(PyFlinkStreamTableTestCase):
 
 assert isinstance(actual, str) or isinstance(actual, unicode)
 
+def test_explain_with_extended(self):
+schema = RowType() \
+.add('a', DataTypes.INT()) \
+.add('b', DataTypes.STRING()) \
+.add('c', DataTypes.STRING())
+t_env = self.t_env
+t = t_env.from_elements([], schema)
+result = t.select("1 + a, b, c")
+
+actual = t_env.explain(result, True)
+
+assert isinstance(actual, str) or isinstance(actual, unicode)
+
+def test_explain_with_multi_sinks(self):
+t_env = self.t_env
+source = t_env.from_elements([(1, "Hi", "Hello"), (2, "Hello", 
"Hello")], ["a", "b", "c"])
+field_names = ["a", "b", "c"]
+field_types = [DataTypes.BIGINT(), DataTypes.STRING(), 
DataTypes.STRING()]
+t_env.register_table_sink(
+"sink1",
+source_sink_utils.TestAppendSink(field_names, field_types))
+t_env.register_table_sink(
+"sink2",
+source_sink_utils.TestAppendSink(field_names, field_types))
+
+t_env.sql_update("insert into sink1 select * from %s where a > 100" % 
source)
+t_env.sql_update("insert into sink2 select * from %s where a < 100" % 
source)
+
+actual = t_env.explain(extended=True)
+
+assert isinstance(actual, str) or isinstance(actual, unicode)

[flink] branch master updated: [FLINK-13263] [python] Supports explain DAG plan in flink-python

2019-07-15 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new dce0d0a  [FLINK-13263] [python] Supports explain DAG plan in 
flink-python
dce0d0a is described below

commit dce0d0a37151400d87f0e8a1e002747da9a0a34e
Author: godfreyhe 
AuthorDate: Mon Jul 15 18:09:31 2019 +0800

[FLINK-13263] [python] Supports explain DAG plan in flink-python

This closes #9114
---
 flink-python/pyflink/table/table_environment.py| 14 +++--
 .../table/tests/test_table_environment_api.py  | 69 +-
 flink-python/pyflink/util/exceptions.py|  8 +++
 3 files changed, 85 insertions(+), 6 deletions(-)

diff --git a/flink-python/pyflink/table/table_environment.py 
b/flink-python/pyflink/table/table_environment.py
index b5619e8..071c1f5 100644
--- a/flink-python/pyflink/table/table_environment.py
+++ b/flink-python/pyflink/table/table_environment.py
@@ -224,15 +224,21 @@ class TableEnvironment(object):
 j_table_name_array = self._j_tenv.listTables()
 return [item for item in j_table_name_array]
 
-def explain(self, table):
+def explain(self, table=None, extended=False):
 """
 Returns the AST of the specified Table API and SQL queries and the 
execution plan to compute
-the result of the given :class:`Table`.
+the result of the given :class:`Table` or multi-sinks plan.
 
-:param table: The table to be explained.
+:param table: The table to be explained. If table is None, the plan of the multi-sinks job is explained, otherwise the plan of the given table.
+:param extended: Whether the plan should contain additional properties, e.g. estimated cost, traits.
:return: The AST and the execution plan as a string.
 """
-return self._j_tenv.explain(table._j_table)
+if table is None:
+return self._j_tenv.explain(extended)
+else:
+return self._j_tenv.explain(table._j_table, extended)
 
 def sql_query(self, query):
 """
diff --git a/flink-python/pyflink/table/tests/test_table_environment_api.py 
b/flink-python/pyflink/table/tests/test_table_environment_api.py
index 54188ff..ef787f8 100644
--- a/flink-python/pyflink/table/tests/test_table_environment_api.py
+++ b/flink-python/pyflink/table/tests/test_table_environment_api.py
@@ -22,11 +22,13 @@ from py4j.compat import unicode
 
 from pyflink.dataset import ExecutionEnvironment
 from pyflink.datastream import StreamExecutionEnvironment
-from pyflink.table.table_environment import BatchTableEnvironment, 
StreamTableEnvironment
+from pyflink.table import DataTypes, CsvTableSink, StreamTableEnvironment
 from pyflink.table.table_config import TableConfig
-from pyflink.table.types import DataTypes, RowType
+from pyflink.table.table_environment import BatchTableEnvironment
+from pyflink.table.types import RowType
 from pyflink.testing import source_sink_utils
 from pyflink.testing.test_case_utils import PyFlinkStreamTableTestCase, 
PyFlinkBatchTableTestCase
+from pyflink.util.exceptions import TableException
 
 
 class StreamTableEnvironmentTests(PyFlinkStreamTableTestCase):
@@ -103,6 +105,38 @@ class 
StreamTableEnvironmentTests(PyFlinkStreamTableTestCase):
 
 assert isinstance(actual, str) or isinstance(actual, unicode)
 
+def test_explain_with_extended(self):
+schema = RowType() \
+.add('a', DataTypes.INT()) \
+.add('b', DataTypes.STRING()) \
+.add('c', DataTypes.STRING())
+t_env = self.t_env
+t = t_env.from_elements([], schema)
+result = t.select("1 + a, b, c")
+
+actual = t_env.explain(result, True)
+
+assert isinstance(actual, str) or isinstance(actual, unicode)
+
+def test_explain_with_multi_sinks(self):
+t_env = self.t_env
+source = t_env.from_elements([(1, "Hi", "Hello"), (2, "Hello", 
"Hello")], ["a", "b", "c"])
+field_names = ["a", "b", "c"]
+field_types = [DataTypes.BIGINT(), DataTypes.STRING(), 
DataTypes.STRING()]
+t_env.register_table_sink(
+"sink1",
+source_sink_utils.TestAppendSink(field_names, field_types))
+t_env.register_table_sink(
+"sink2",
+source_sink_utils.TestAppendSink(field_names, field_types))
+
+t_env.sql_update("insert into sink1 select * from %s where a > 100" % 
source)
+t_env.sql_update("insert into sink2 select * from %s where a < 100" % 
source)
+
+actual = t_env.explain(extended=True)
+
+assert isinstance(actual, str) or isinstance(actual, unicode)
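
The release-1.9 and master messages above carry the same change, so a single usage sketch covers both. It assumes a StreamTableEnvironment t_env with a table "Orders" and sinks "sink1"/"sink2" registered, mirroring the test cases in the diff; those names are placeholders.

{% highlight python %}
# Explain a single table, including extended properties (estimated cost, traits).
table = t_env.scan("Orders").select("a, b")
print(t_env.explain(table, extended=True))

# Explain the whole multi-sinks plan: wire up the inserts first,
# then call explain() without a table argument.
t_env.sql_update("INSERT INTO sink1 SELECT * FROM Orders WHERE a > 100")
t_env.sql_update("INSERT INTO sink2 SELECT * FROM Orders WHERE a < 100")
print(t_env.explain(extended=True))
{% endhighlight %}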

[flink-web] branch asf-site updated: Rebuild website

2019-07-12 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new a22da2b  Rebuild website
a22da2b is described below

commit a22da2b6994bc1b891eb5d2d8c5fd9fed04d9931
Author: sunjincheng121 
AuthorDate: Fri Jul 12 14:18:25 2019 +0800

Rebuild website
---
 content/community.html| 2 +-
 content/zh/community.html | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/content/community.html b/content/community.html
index 91ec4d6..9a218d6 100644
--- a/content/community.html
+++ b/content/community.html
@@ -549,7 +549,7 @@
   
 https://avatars1.githubusercontent.com/u/22488084?s=50; 
class="committer-avatar" />
 Jincheng Sun
-Committer
+PMC, Committer
 jincheng
   
   
diff --git a/content/zh/community.html b/content/zh/community.html
index 4288175..3826312 100644
--- a/content/zh/community.html
+++ b/content/zh/community.html
@@ -534,7 +534,7 @@
   
 https://avatars1.githubusercontent.com/u/22488084?s=50; 
class="committer-avatar" />
 Jincheng Sun
-Committer
+PMC, Committer
 jincheng
   
   



[flink-web] branch asf-site updated: update the role info

2019-07-12 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 5bd506c  update the role info
5bd506c is described below

commit 5bd506ca195c644e74da61b4d7151649cec3721c
Author: sunjincheng121 
AuthorDate: Fri Jul 12 14:16:57 2019 +0800

update the role info
---
 community.md| 2 +-
 community.zh.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/community.md b/community.md
index bd9d442..5bbe9a9 100644
--- a/community.md
+++ b/community.md
@@ -361,7 +361,7 @@ Flink Forward is a conference happening yearly in different 
locations around the
   
 https://avatars1.githubusercontent.com/u/22488084?s=50; 
class="committer-avatar">
 Jincheng Sun
-Committer
+PMC, Committer
 jincheng
   
   
diff --git a/community.zh.md b/community.zh.md
index b8d25bc..faf85de 100644
--- a/community.zh.md
+++ b/community.zh.md
@@ -351,7 +351,7 @@ Flink Forward 2015 (Oct 12-13, 2015) was the first conference to bring Apache Flink
   
 https://avatars1.githubusercontent.com/u/22488084?s=50; 
class="committer-avatar">
 Jincheng Sun
-Committer
+PMC, Committer
 jincheng
   
   



[flink] branch master updated: [FLINK-12602][travis] Correct the flink pom `artifactId` config and s… (#8563)

2019-07-11 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 5762170  [FLINK-12602][travis] Correct the flink pom `artifactId` 
config and s… (#8563)
5762170 is described below

commit 57621703b7342442f3a0ec62315ce1cefa0a8287
Author: Jincheng Sun 
AuthorDate: Fri Jul 12 09:22:03 2019 +0800

[FLINK-12602][travis] Correct the flink pom `artifactId` config and s… 
(#8563)

 Brief change log:
  - remove the scala version suffix for connector-hive and 
queryable-state-client-java
  - add the scala dependencies for table-api-scala and flink-sql-connectors
  - correct the scala-free check logic in `verify_scala_suffixes.sh`
---
 docs/dev/stream/state/queryable_state.md| 2 +-
 docs/dev/stream/state/queryable_state.zh.md | 2 +-
 flink-connectors/flink-connector-cassandra/pom.xml  | 2 +-
 flink-connectors/flink-connector-filesystem/pom.xml | 2 +-
 flink-connectors/flink-connector-kafka-0.10/pom.xml | 2 +-
 flink-connectors/flink-connector-kafka-0.11/pom.xml | 2 +-
 flink-connectors/flink-connector-kafka-0.8/pom.xml  | 2 +-
 flink-connectors/flink-connector-kafka-0.9/pom.xml  | 2 +-
 flink-connectors/flink-connector-kafka-base/pom.xml | 2 +-
 flink-connectors/flink-connector-kafka/pom.xml  | 2 +-
 flink-connectors/flink-connector-kinesis/pom.xml| 2 +-
 flink-connectors/flink-connector-nifi/pom.xml   | 2 +-
 flink-end-to-end-tests/flink-queryable-state-test/pom.xml   | 2 +-
 flink-fs-tests/pom.xml  | 2 +-
 flink-libraries/flink-cep-scala/pom.xml | 2 +-
 flink-libraries/flink-gelly-examples/pom.xml| 2 +-
 flink-libraries/flink-gelly-scala/pom.xml   | 2 +-
 flink-libraries/flink-state-processing-api/pom.xml  | 2 +-
 flink-queryable-state/flink-queryable-state-client-java/pom.xml | 2 +-
 flink-queryable-state/flink-queryable-state-runtime/pom.xml | 2 +-
 flink-runtime/pom.xml   | 2 +-
 flink-streaming-scala/pom.xml   | 2 +-
 flink-table/flink-table-api-scala/pom.xml   | 9 -
 flink-table/flink-table-planner/pom.xml | 2 +-
 flink-tests/pom.xml | 2 +-
 flink-yarn-tests/pom.xml| 2 +-
 tools/verify_scala_suffixes.sh  | 6 +++---
 27 files changed, 36 insertions(+), 29 deletions(-)

diff --git a/docs/dev/stream/state/queryable_state.md 
b/docs/dev/stream/state/queryable_state.md
index fb14cb4..ee6b4be 100644
--- a/docs/dev/stream/state/queryable_state.md
+++ b/docs/dev/stream/state/queryable_state.md
@@ -174,7 +174,7 @@ jar which must be explicitly included as a dependency in 
the `pom.xml` of your p
 
 
   org.apache.flink
-  flink-queryable-state-client-java{{ site.scala_version_suffix 
}}
+  flink-queryable-state-client-java
   {{ site.version }}
 
 {% endhighlight %}
diff --git a/docs/dev/stream/state/queryable_state.zh.md 
b/docs/dev/stream/state/queryable_state.zh.md
index c9a16c4..a101110 100644
--- a/docs/dev/stream/state/queryable_state.zh.md
+++ b/docs/dev/stream/state/queryable_state.zh.md
@@ -174,7 +174,7 @@ jar which must be explicitly included as a dependency in 
the `pom.xml` of your p
 
 
   org.apache.flink
-  flink-queryable-state-client-java{{ site.scala_version_suffix 
}}
+  flink-queryable-state-client-java
   {{ site.version }}
 
 {% endhighlight %}
diff --git a/flink-connectors/flink-connector-cassandra/pom.xml 
b/flink-connectors/flink-connector-cassandra/pom.xml
index 0218d6e..6338b18 100644
--- a/flink-connectors/flink-connector-cassandra/pom.xml
+++ b/flink-connectors/flink-connector-cassandra/pom.xml
@@ -211,7 +211,7 @@ under the License.


org.apache.flink
-   
flink-tests_${scala.binary.version}
+   flink-tests
${project.version}
test

diff --git a/flink-connectors/flink-connector-filesystem/pom.xml 
b/flink-connectors/flink-connector-filesystem/pom.xml
index 0202986..d6af8f3 100644
--- a/flink-connectors/flink-connector-filesystem/pom.xml
+++ b/flink-connectors/flink-connector-filesystem/pom.xml
@@ -99,7 +99,7 @@ under the License.
 

org.apache.flink
-   
flink-tests_${scala.binary.version}
+   flink-tests
${project.version}
test
test-jar
diff --git

[flink] branch master updated: [hotfix][python] Use the TableEnvironment.execute() method instead of ExecutionEnvironment.execute()/StreamExecutionEnvironment.execute(). (#9087)

2019-07-11 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 1aa43fc  [hotfix][python] Use the TableEnvironment.execute() method 
instead of ExecutionEnvironment.execute()/StreamExecutionEnvironment.execute(). 
(#9087)
1aa43fc is described below

commit 1aa43fc62341b4e70676ee5be2b67bd7136856e1
Author: WeiZhong94 <44194288+weizhon...@users.noreply.github.com>
AuthorDate: Thu Jul 11 22:51:23 2019 +0800

[hotfix][python] Use the TableEnvironment.execute() method instead of 
ExecutionEnvironment.execute()/StreamExecutionEnvironment.execute(). (#9087)
---
 docs/dev/table/common.md   |  2 +-
 docs/dev/table/common.zh.md|  2 +-
 docs/dev/table/tableApi.md |  2 +-
 docs/dev/table/tableApi.zh.md  |  2 +-
 docs/ops/python_shell.md   |  4 +-
 docs/ops/python_shell.zh.md|  4 +-
 docs/tutorials/python_table_api.md |  6 +--
 docs/tutorials/python_table_api.zh.md  |  6 +--
 flink-python/pyflink/shell.py  |  4 +-
 .../pyflink/table/examples/batch/word_count.py |  2 +-
 flink-python/pyflink/table/table.py|  2 +-
 flink-python/pyflink/table/table_environment.py| 49 ++
 flink-python/pyflink/table/tests/test_calc.py  |  2 +-
 .../pyflink/table/tests/test_descriptor.py |  4 +-
 .../pyflink/table/tests/test_shell_example.py  |  4 +-
 .../table/tests/test_table_environment_api.py  |  6 +--
 16 files changed, 38 insertions(+), 63 deletions(-)

diff --git a/docs/dev/table/common.md b/docs/dev/table/common.md
index 5228ff2..c22fbc6 100644
--- a/docs/dev/table/common.md
+++ b/docs/dev/table/common.md
@@ -116,7 +116,7 @@ sql_result  = table_env.sql_query("SELECT ... FROM table2 
...")
 tapi_result.insert_into("outputTable")
 
 # execute
-env.execute()
+table_env.execute("python_job")
 
 {% endhighlight %}
 
diff --git a/docs/dev/table/common.zh.md b/docs/dev/table/common.zh.md
index 5d6f9ae..1f0d6f6 100644
--- a/docs/dev/table/common.zh.md
+++ b/docs/dev/table/common.zh.md
@@ -116,7 +116,7 @@ sql_result  = table_env.sql_query("SELECT ... FROM table2 
...")
 tapi_result.insert_into("outputTable")
 
 # execute
-env.execute()
+table_env.execute("python_job")
 
 {% endhighlight %}
 
diff --git a/docs/dev/table/tableApi.md b/docs/dev/table/tableApi.md
index 8dc33e1..c0a2842 100644
--- a/docs/dev/table/tableApi.md
+++ b/docs/dev/table/tableApi.md
@@ -119,7 +119,7 @@ orders = t_env.scan("Orders")  # schema (a, b, c, rowtime)
 
 orders.group_by("a").select("a, b.count as cnt").insert_into("result")
 
-env.execute()
+t_env.execute("python_job")
 
 {% endhighlight %}
 
diff --git a/docs/dev/table/tableApi.zh.md b/docs/dev/table/tableApi.zh.md
index 409a7b4..729a531 100644
--- a/docs/dev/table/tableApi.zh.md
+++ b/docs/dev/table/tableApi.zh.md
@@ -119,7 +119,7 @@ orders = t_env.scan("Orders")  # schema (a, b, c, rowtime)
 
 orders.group_by("a").select("a, b.count as cnt").insert_into("result")
 
-env.execute()
+t_env.execute("python_job")
 
 {% endhighlight %}
 
diff --git a/docs/ops/python_shell.md b/docs/ops/python_shell.md
index a52cd23..2e5a0cf 100644
--- a/docs/ops/python_shell.md
+++ b/docs/ops/python_shell.md
@@ -70,7 +70,7 @@ The example below is a simple program in the Python shell:
 ... .register_table_sink("stream_sink")
 >>> t.select("a + 1, b, c")\
 ... .insert_into("stream_sink")
->>> s_env.execute()
+>>> st_env.execute("stream_job")
 >>> # If the job runs in local mode, you can exec following code in Python 
 >>> shell to see the result:
 >>> with open(sink_path, 'r') as f:
 ... print(f.read())
@@ -102,7 +102,7 @@ The example below is a simple program in the Python shell:
 ... .register_table_sink("batch_sink")
 >>> t.select("a + 1, b, c")\
 ... .insert_into("batch_sink")
->>> b_env.execute()
+>>> bt_env.execute("batch_job")
 >>> # If the job runs in local mode, you can exec following code in Python 
 >>> shell to see the result:
 >>> with open(sink_path, 'r') as f:
 ... print(f.read())
diff --git a/docs/ops/python_shell.zh.md b/docs/ops/python_shell.zh.md
index 90440d8..2566c9e 100644
--- a/docs/ops/python_shell.zh.md
+++ b/docs/ops/python_shell.zh.md
@@ -69,7 +69,7 @@ bin/pyflink-shell.sh local
 ... .register_table_sink("stream_sink")
 >>> t.select("a + 1, b, c")\
... .insert_into("stream_sink")
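
A condensed sketch of the pattern this hotfix establishes, assuming a registered source table "Orders" and a registered sink "result_sink" (both placeholders):

{% highlight python %}
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env)

t_env.scan("Orders").select("a + 1, b, c").insert_into("result_sink")

# Trigger execution through the TableEnvironment, not through env.execute().
t_env.execute("python_job")
{% endhighlight %}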

[flink] branch master updated: [hotfix][python] Align with Java Table API to remove QueryConfig (#9063)

2019-07-10 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 8acc1d3  [hotfix][python] Align with Java Table API to remove 
QueryConfig (#9063)
8acc1d3 is described below

commit 8acc1d39ba4d37392aba322ebae635d213fcdd72
Author: WeiZhong94 <44194288+weizhon...@users.noreply.github.com>
AuthorDate: Wed Jul 10 17:53:54 2019 +0800

[hotfix][python] Align with Java Table API to remove QueryConfig (#9063)
---
 docs/dev/table/streaming/query_configuration.md|  14 +-
 docs/dev/table/streaming/query_configuration.zh.md |  14 +-
 flink-python/pyflink/table/__init__.py |   3 -
 flink-python/pyflink/table/query_config.py | 121 
 flink-python/pyflink/table/table_config.py | 155 -
 .../table/tests/test_table_config_completeness.py  |  58 
 .../table/tests/test_table_environment_api.py  |  28 +++-
 7 files changed, 243 insertions(+), 150 deletions(-)

diff --git a/docs/dev/table/streaming/query_configuration.md 
b/docs/dev/table/streaming/query_configuration.md
index 1e0527d..dbb18bc 100644
--- a/docs/dev/table/streaming/query_configuration.md
+++ b/docs/dev/table/streaming/query_configuration.md
@@ -91,13 +91,13 @@ val stream: DataStream[Row] = 
result.toAppendStream[Row](qConfig)
 
 
 {% highlight python %}
-env = StreamExecutionEnvironment.get_execution_environment()
-table_env = StreamTableEnvironment.create(env)
-
-# obtain query configuration from TableEnvironment
-q_config = StreamQueryConfig()
+# use TableConfig instead of QueryConfig in python API
+t_config = TableConfig()
 # set query parameters
-q_config.with_idle_state_retention_time(timedelta(hours=12), 
timedelta(hours=24))
+t_config.set_idle_state_retention_time(timedelta(hours=12), 
timedelta(hours=24))
+
+env = StreamExecutionEnvironment.get_execution_environment()
+table_env = StreamTableEnvironment.create(env, t_config)
 
 # define query
 result = ...
@@ -110,7 +110,7 @@ table_env.register_table_sink("outputTable",  # table name
   sink)  # table sink
 
 # emit result Table via a TableSink
-result.insert_into("outputTable", q_config)
+result.insert_into("outputTable")
 
 {% endhighlight %}
 
diff --git a/docs/dev/table/streaming/query_configuration.zh.md 
b/docs/dev/table/streaming/query_configuration.zh.md
index 1e0527d..dbb18bc 100644
--- a/docs/dev/table/streaming/query_configuration.zh.md
+++ b/docs/dev/table/streaming/query_configuration.zh.md
@@ -91,13 +91,13 @@ val stream: DataStream[Row] = 
result.toAppendStream[Row](qConfig)
 
 
 {% highlight python %}
-env = StreamExecutionEnvironment.get_execution_environment()
-table_env = StreamTableEnvironment.create(env)
-
-# obtain query configuration from TableEnvironment
-q_config = StreamQueryConfig()
+# use TableConfig instead of QueryConfig in python API
+t_config = TableConfig()
 # set query parameters
-q_config.with_idle_state_retention_time(timedelta(hours=12), 
timedelta(hours=24))
+t_config.set_idle_state_retention_time(timedelta(hours=12), 
timedelta(hours=24))
+
+env = StreamExecutionEnvironment.get_execution_environment()
+table_env = StreamTableEnvironment.create(env, t_config)
 
 # define query
 result = ...
@@ -110,7 +110,7 @@ table_env.register_table_sink("outputTable",  # table name
   sink)  # table sink
 
 # emit result Table via a TableSink
-result.insert_into("outputTable", q_config)
+result.insert_into("outputTable")
 
 {% endhighlight %}
 
diff --git a/flink-python/pyflink/table/__init__.py 
b/flink-python/pyflink/table/__init__.py
index ac5991d..48a150e 100644
--- a/flink-python/pyflink/table/__init__.py
+++ b/flink-python/pyflink/table/__init__.py
@@ -53,7 +53,6 @@ Important classes of Flink Table API:
 """
 from __future__ import absolute_import
 
-from pyflink.table.query_config import BatchQueryConfig, StreamQueryConfig
 from pyflink.table.table import Table, GroupedTable, GroupWindowedTable, 
OverWindowedTable, \
 WindowGroupedTable
 from pyflink.table.table_config import TableConfig
@@ -74,8 +73,6 @@ __all__ = [
 'OverWindowedTable',
 'WindowGroupedTable',
 'TableConfig',
-'StreamQueryConfig',
-'BatchQueryConfig',
 'TableSink',
 'TableSource',
 'WriteMode',
diff --git a/flink-python/pyflink/table/query_config.py 
b/flink-python/pyflink/table/query_config.py
deleted file mode 100644
index 69b6488..000
--- a/flink-python/pyflink/table/query_config.py
+++ /dev/null
@@ -1,121 +0,0 @@
-
-#  Licensed to the Apache Software Foundation (ASF) under one
-#  or more contributor license agreements.  See the NOTICE file
-#  distributed with this work for additional information
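
Since the message is cut off above, here is the replacement pattern in one piece, restated from the query_configuration docs diff earlier in this message:

{% highlight python %}
from datetime import timedelta

from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import StreamTableEnvironment, TableConfig

# The idle state retention interval moves from the removed
# StreamQueryConfig onto TableConfig.
t_config = TableConfig()
t_config.set_idle_state_retention_time(timedelta(hours=12), timedelta(hours=24))

env = StreamExecutionEnvironment.get_execution_environment()
t_env = StreamTableEnvironment.create(env, t_config)
{% endhighlight %}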

[flink] branch master updated: [hotfix][python] Update the package name from pyflink to apache-flink (#9028)

2019-07-10 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 32d3737  [hotfix][python] Update the package name from pyflink to 
apache-flink (#9028)
32d3737 is described below

commit 32d373787555428c990ec53b3bdff6864cc21a69
Author: dianfu 
AuthorDate: Wed Jul 10 16:10:12 2019 +0800

[hotfix][python] Update the package name from pyflink to apache-flink 
(#9028)
---
 flink-python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-python/setup.py b/flink-python/setup.py
index b87e49f..2b417f1 100644
--- a/flink-python/setup.py
+++ b/flink-python/setup.py
@@ -140,7 +140,7 @@ run sdist.
 scripts.append("pyflink/find_flink_home.py")
 
 setup(
-name='pyflink',
+name='apache-flink',
 version=VERSION,
 packages=['pyflink',
   'pyflink.table',



[flink] annotated tag release-1.8.1 created (now 2aeaef1)

2019-07-09 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to annotated tag release-1.8.1
in repository https://gitbox.apache.org/repos/asf/flink.git.


  at 2aeaef1  (tag)
 tagging d0bcbf0972aacd366efd4ee2741df51195999045 (tag)
  length 1000 bytes
  by sunjincheng121
  on Wed Jul 10 03:10:09 2019 +0800

- Log -
release-1.8.1
-----BEGIN PGP SIGNATURE-----

iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl0k5pgACgkQMhGwcDt5
6g5zgg/+If1Cq3ceSbXX1SZMep8L3I0yGPmLFGPTF/92EJH87iPUUL8zdddDZXqE
7f+ciPNDF8sKZJfC1G/zAJ11q0c68O8j3+4Rj3wn+ERdTH9LDQpEms+uV/+WDepz
C+VGJ4GRkRgggOkfCf/Nzz9GBx+O7zXTAIoXmZe5Kkj3geO1UaQm0NQyQMQb6fdm
7ZUrJCzM3bMz0pTCQmC5FEcGFaeMF9nZ78D8cgsFjhSDn8Re7Rv9CF5SudN9Pz2v
mZllp5h9EDyCEmcY0lmitKe2HwR220Kit7VB/bGEQaWWEk5hHTFRwVi1J+Oul4CJ
XFlOE0Jd7mBP4PUgvOA3Rmf+5UUfEf2Dswscc31sZD+884PuZ/lwgMnlaltRYFS2
iZ0CdSQdPd+qbw87R20hJf11oiF+wbDotnllv3hxbxtDLyBqA34itv8KS9LFRmvv
37PcVnq1dNaGNBoG5fxox8WOeTv6mo5Y9UnscNIu0X4NTLUMhcT0YzIwuGhg6Mml
pMzy7aAIgaDvnw/GwjYrhWR3neYzuYZ4HSBHkQgj5KxU0BV+10WiRpseqU3z/IEN
v6LqK4eku5YvCWWlfq6jc/CMcfJMJ0CxKznSQ663RL48x0gKt2rJSJgm/t0EYZrB
P7j9p7cabcaVa6qefwuYKBd9ZV6wpFSeh8fWELYzRGggMz99tCA=
=MW9Z
-----END PGP SIGNATURE-----
---

This annotated tag includes the following new commits:

 new 7297bac  Commit for release 1.8.1

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[flink] 01/01: Commit for release 1.8.1

2019-07-09 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to annotated tag release-1.8.1
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 7297bacfe14b9f814c308d95003c837115a6bdd2
Author: sunjincheng121 
AuthorDate: Mon Jun 24 23:04:28 2019 +0800

Commit for release 1.8.1
---
 docs/_config.yml  | 4 ++--
 flink-annotations/pom.xml | 2 +-
 flink-clients/pom.xml | 2 +-
 flink-connectors/flink-connector-cassandra/pom.xml| 2 +-
 flink-connectors/flink-connector-elasticsearch-base/pom.xml   | 2 +-
 flink-connectors/flink-connector-elasticsearch/pom.xml| 2 +-
 flink-connectors/flink-connector-elasticsearch2/pom.xml   | 2 +-
 flink-connectors/flink-connector-elasticsearch5/pom.xml   | 2 +-
 flink-connectors/flink-connector-elasticsearch6/pom.xml   | 2 +-
 flink-connectors/flink-connector-filesystem/pom.xml   | 2 +-
 flink-connectors/flink-connector-kafka-0.10/pom.xml   | 2 +-
 flink-connectors/flink-connector-kafka-0.11/pom.xml   | 2 +-
 flink-connectors/flink-connector-kafka-0.8/pom.xml| 2 +-
 flink-connectors/flink-connector-kafka-0.9/pom.xml| 2 +-
 flink-connectors/flink-connector-kafka-base/pom.xml   | 2 +-
 flink-connectors/flink-connector-kafka/pom.xml| 2 +-
 flink-connectors/flink-connector-kinesis/pom.xml  | 2 +-
 flink-connectors/flink-connector-nifi/pom.xml | 2 +-
 flink-connectors/flink-connector-rabbitmq/pom.xml | 2 +-
 flink-connectors/flink-connector-twitter/pom.xml  | 2 +-
 flink-connectors/flink-hadoop-compatibility/pom.xml   | 2 +-
 flink-connectors/flink-hbase/pom.xml  | 2 +-
 flink-connectors/flink-hcatalog/pom.xml   | 2 +-
 flink-connectors/flink-jdbc/pom.xml   | 2 +-
 flink-connectors/flink-orc/pom.xml| 2 +-
 flink-connectors/flink-sql-connector-elasticsearch6/pom.xml   | 2 +-
 flink-connectors/flink-sql-connector-kafka-0.10/pom.xml   | 2 +-
 flink-connectors/flink-sql-connector-kafka-0.11/pom.xml   | 2 +-
 flink-connectors/flink-sql-connector-kafka-0.9/pom.xml| 2 +-
 flink-connectors/flink-sql-connector-kafka/pom.xml| 2 +-
 flink-connectors/pom.xml  | 2 +-
 flink-container/pom.xml   | 2 +-
 flink-contrib/flink-connector-wikiedits/pom.xml   | 2 +-
 flink-contrib/pom.xml | 2 +-
 flink-core/pom.xml| 2 +-
 flink-dist/pom.xml| 2 +-
 flink-docs/pom.xml| 2 +-
 flink-end-to-end-tests/flink-bucketing-sink-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-cli-test/pom.xml | 2 +-
 flink-end-to-end-tests/flink-confluent-schema-registry/pom.xml| 2 +-
 flink-end-to-end-tests/flink-dataset-allround-test/pom.xml| 2 +-
 flink-end-to-end-tests/flink-datastream-allround-test/pom.xml | 2 +-
 flink-end-to-end-tests/flink-distributed-cache-via-blob-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-e2e-test-utils/pom.xml   | 2 +-
 flink-end-to-end-tests/flink-elasticsearch1-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-elasticsearch2-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-elasticsearch5-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-elasticsearch6-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-end-to-end-tests-common/pom.xml  | 4 ++--
 flink-end-to-end-tests/flink-heavy-deployment-stress-test/pom.xml | 2 +-
 flink-end-to-end-tests/flink-high-parallelism-iterations-test/pom.xml | 2 +-
 .../flink-local-recovery-and-allocation-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-metrics-availability-test/pom.xml| 2 +-
 flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/pom.xml | 2 +-
 .../flink-parent-child-classloading-test-lib-package/pom.xml  | 2 +-
 .../flink-parent-child-classloading-test-program/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-queryable-state-test/pom.xml | 2 +-
 flink-end-to-end-tests/flink-quickstart-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-sql-client-test/pom.xml  | 2 +-
 flink-end-to-end-tests/flink-state-evolution-test

[flink] branch master updated: [hotfix][python][docs]Fix wrong example code in Python REPL and wrong table plan in table/common page (#9018)

2019-07-09 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new e4f4470  [hotfix][python][docs]Fix wrong example code in Python REPL 
and wrong table plan in table/common page (#9018)
e4f4470 is described below

commit e4f44709a56c1a1cdfc2093a7914db7c3b0bbcda
Author: HuangXingBo 
AuthorDate: Tue Jul 9 14:41:49 2019 +0800

[hotfix][python][docs]Fix wrong example code in Python REPL and wrong table 
plan in table/common page (#9018)
---
 docs/dev/table/common.md  | 126 --
 docs/dev/table/common.zh.md   | 126 --
 docs/dev/table/tableApi.md|   8 +--
 docs/dev/table/tableApi.zh.md |   8 +--
 docs/ops/python_shell.md  |  10 ++--
 docs/ops/python_shell.zh.md   |  10 ++--
 6 files changed, 232 insertions(+), 56 deletions(-)

diff --git a/docs/dev/table/common.md b/docs/dev/table/common.md
index 705a978..5228ff2 100644
--- a/docs/dev/table/common.md
+++ b/docs/dev/table/common.md
@@ -1372,38 +1372,128 @@ print(explanation)
 
 
 
+
+
 {% highlight text %}
 == Abstract Syntax Tree ==
 LogicalUnion(all=[true])
-  LogicalFilter(condition=[LIKE($1, 'F%')])
-LogicalTableScan(table=[[_DataStreamTable_0]])
-  LogicalTableScan(table=[[_DataStreamTable_1]])
+  LogicalFilter(condition=[LIKE($1, _UTF-16LE'F%')])
+FlinkLogicalDataStreamScan(id=[1], fields=[count, word])
+  FlinkLogicalDataStreamScan(id=[2], fields=[count, word])
 
 == Optimized Logical Plan ==
-DataStreamUnion(union=[count, word])
-  DataStreamCalc(select=[count, word], where=[LIKE(word, 'F%')])
-DataStreamScan(table=[[_DataStreamTable_0]])
-  DataStreamScan(table=[[_DataStreamTable_1]])
+DataStreamUnion(all=[true], union all=[count, word])
+  DataStreamCalc(select=[count, word], where=[LIKE(word, _UTF-16LE'F%')])
+DataStreamScan(id=[1], fields=[count, word])
+  DataStreamScan(id=[2], fields=[count, word])
 
 == Physical Execution Plan ==
 Stage 1 : Data Source
-  content : collect elements with CollectionInputFormat
+   content : collect elements with CollectionInputFormat
 
 Stage 2 : Data Source
-  content : collect elements with CollectionInputFormat
+   content : collect elements with CollectionInputFormat
 
-  Stage 3 : Operator
-content : from: (count, word)
-ship_strategy : REBALANCE
+   Stage 3 : Operator
+   content : from: (count, word)
+   ship_strategy : REBALANCE
 
-Stage 4 : Operator
-  content : where: (LIKE(word, 'F%')), select: (count, word)
-  ship_strategy : FORWARD
+   Stage 4 : Operator
+   content : where: (LIKE(word, _UTF-16LE'F%')), select: 
(count, word)
+   ship_strategy : FORWARD
 
-  Stage 5 : Operator
-content : from: (count, word)
-ship_strategy : REBALANCE
+   Stage 5 : Operator
+   content : from: (count, word)
+   ship_strategy : REBALANCE
 {% endhighlight %}
+
+
+
+{% highlight text %}
+== Abstract Syntax Tree ==
+LogicalUnion(all=[true])
+  LogicalFilter(condition=[LIKE($1, _UTF-16LE'F%')])
+FlinkLogicalDataStreamScan(id=[1], fields=[count, word])
+  FlinkLogicalDataStreamScan(id=[2], fields=[count, word])
+
+== Optimized Logical Plan ==
+DataStreamUnion(all=[true], union all=[count, word])
+  DataStreamCalc(select=[count, word], where=[LIKE(word, _UTF-16LE'F%')])
+DataStreamScan(id=[1], fields=[count, word])
+  DataStreamScan(id=[2], fields=[count, word])
+
+== Physical Execution Plan ==
+Stage 1 : Data Source
+   content : collect elements with CollectionInputFormat
+
+Stage 2 : Data Source
+   content : collect elements with CollectionInputFormat
+
+   Stage 3 : Operator
+   content : from: (count, word)
+   ship_strategy : REBALANCE
+
+   Stage 4 : Operator
+   content : where: (LIKE(word, _UTF-16LE'F%')), select: 
(count, word)
+   ship_strategy : FORWARD
+
+   Stage 5 : Operator
+   content : from: (count, word)
+   ship_strategy : REBALANCE
+{% endhighlight %}
+
+
+
+{% highlight text %}
+== Abstract Syntax Tree ==
+LogicalUnion(all=[true])
+  LogicalFilter(condition=[LIKE($1, _UTF-16LE'F%')])
+FlinkLogicalDataStreamScan(id=[3], fields=[count, word])
+  FlinkLogicalDataStreamScan(id=[6], fields=[count, word])
+
+== Optimized Logical Plan ==
+DataStreamUnion(all=[true], union all=[count, word])
+  DataStreamCalc(select=[count, word], where=[LIKE(word, _UTF-16LE'F%')])
+DataStreamScan(id=[3], fields=[count, word])
+  DataStreamScan(id=[6], fields=[count, word])
+
+== Physical Execution Plan ==
+Stage 1 : Data Source
+   content : collect elements with CollectionInputFormat

[flink] branch master updated: [hotfix][python]Fix the install failure of pyflink used previous version of python 2.7 (#9016)

2019-07-09 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new ed0b3ff  [hotfix][python]Fix the install failure of pyflink used 
previous version of python 2.7 (#9016)
ed0b3ff is described below

commit ed0b3ffea3aad427dc823a2c08f7dcfd1000107d
Author: HuangXingBo 
AuthorDate: Tue Jul 9 14:12:26 2019 +0800

[hotfix][python]Fix the install failure of pyflink used previous version of 
python 2.7 (#9016)

Fix for Python 2.7.10.
---
 flink-python/setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-python/setup.py b/flink-python/setup.py
index ec7153e..b87e49f 100644
--- a/flink-python/setup.py
+++ b/flink-python/setup.py
@@ -170,7 +170,7 @@ run sdist.
 package_data={
 'pyflink': ['LICENSE', 'NOTICE', 'README.txt'],
 'pyflink.lib': ['*.jar'],
-'pyflink.opt': ['*', '*/*'],
+'pyflink.opt': ['*.*', '*/*'],
 'pyflink.conf': ['*'],
 'pyflink.log': ['*'],
 'pyflink.examples': ['*.py', '*/*.py'],



[flink] branch master updated: [FLINK-12991][python] Correct the implementation of Catalog.get_table_factory (#8956)

2019-07-08 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 86fd4b5  [FLINK-12991][python] Correct the implementation of 
Catalog.get_table_factory (#8956)
86fd4b5 is described below

commit 86fd4b5722414b0255f9e1a31492824d5ec5b126
Author: dianfu 
AuthorDate: Tue Jul 9 09:19:29 2019 +0800

[FLINK-12991][python] Correct the implementation of 
Catalog.get_table_factory (#8956)
---
 flink-python/pyflink/table/catalog.py | 9 -
 flink-python/pyflink/table/tests/test_catalog_completeness.py | 2 +-
 2 files changed, 1 insertion(+), 10 deletions(-)

diff --git a/flink-python/pyflink/table/catalog.py 
b/flink-python/pyflink/table/catalog.py
index 081d4c6..a476f66 100644
--- a/flink-python/pyflink/table/catalog.py
+++ b/flink-python/pyflink/table/catalog.py
@@ -54,15 +54,6 @@ class Catalog(object):
 """
 return self._j_catalog.getDefaultDatabase()
 
-def get_table_factory(self):
-"""
-Get an optional TableFactory instance that's responsible for 
generating source/sink for
-tables stored in this catalog.
-
-:return: An optional TableFactory instance.
-"""
-return self._j_catalog.getTableFactory()
-
 def list_databases(self):
 """
 Get the names of all databases in this catalog.
diff --git a/flink-python/pyflink/table/tests/test_catalog_completeness.py 
b/flink-python/pyflink/table/tests/test_catalog_completeness.py
index 40612e5..9474c30 100644
--- a/flink-python/pyflink/table/tests/test_catalog_completeness.py
+++ b/flink-python/pyflink/table/tests/test_catalog_completeness.py
@@ -40,7 +40,7 @@ class 
CatalogAPICompletenessTests(PythonAPICompletenessTestCase, unittest.TestCa
 @classmethod
 def excluded_methods(cls):
 # open/close are not needed in Python API as they are used internally
-return {'open', 'close'}
+return {'open', 'close', 'getTableFactory'}
 
 
 class CatalogDatabaseAPICompletenessTests(PythonAPICompletenessTestCase, 
unittest.TestCase):
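
With get_table_factory removed, the remaining Catalog surface can still be exercised as sketched below. "my_catalog" is a placeholder, and TableEnvironment.get_catalog is assumed to be available for looking the catalog up.

{% highlight python %}
catalog = t_env.get_catalog("my_catalog")

print(catalog.get_default_database())  # name of the default database
print(catalog.list_databases())        # names of all databases in this catalog
{% endhighlight %}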



[flink] branch master updated: [FLINK-12723][docs] setup IDE for Python (#8992)

2019-07-05 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new ee66854  [FLINK-12723][docs] setup IDE for Python (#8992)
ee66854 is described below

commit ee668541c7b39a30f74e44465e686153df191bd9
Author: Jincheng Sun 
AuthorDate: Fri Jul 5 19:33:43 2019 +0800

[FLINK-12723][docs] setup IDE for Python (#8992)

This closes #8992
---
 docs/flinkDev/ide_setup.md| 34 ++
 docs/flinkDev/ide_setup.zh.md | 34 ++
 2 files changed, 68 insertions(+)

diff --git a/docs/flinkDev/ide_setup.md b/docs/flinkDev/ide_setup.md
index c27e479..a6108af 100644
--- a/docs/flinkDev/ide_setup.md
+++ b/docs/flinkDev/ide_setup.md
@@ -125,4 +125,38 @@ due to version incompatibilities with the bundled Scala 
version in Scala IDE 4.4
 
 **We recommend to use IntelliJ instead (see [above](#intellij-idea))**
 
+## PyCharm
+
+A brief guide on how to set up PyCharm for development of the flink-python 
module.
+
+The following documentation describes the steps to set up PyCharm 2019.1.3
+([https://www.jetbrains.com/pycharm/download/](https://www.jetbrains.com/pycharm/download/))
+with the Flink Python sources.
+
+### Importing flink-python
+If you are in the PyCharm startup interface:
+
+1. Start PyCharm and choose "Open"
+2. Select the flink-python folder in the cloned Flink repository
+
+If you have used PyCharm to open a project:
+
+1. Select "File -> Open"
+2. Select the flink-python folder in the cloned Flink repository
+
+
+### Checkstyle For Python
+To check the Python code style of Apache Flink, set up flake8 as an external tool in the project.
+
+1. Install flake8 in the Python interpreter used by the project (see [https://pypi.org/project/flake8/](https://pypi.org/project/flake8/)).
+2. Select "PyCharm -> Preferences... -> Tools -> External Tools -> + (the bottom left corner of the page on the right)", then configure the external tool.
+3. Set the "Name" to "flake8".
+4. Set the "Description" to "code style check".
+5. Set the "Program" to the Python interpreter path (e.g. /usr/bin/python).
+6. Set the "Arguments" to "-m flake8 \-\-config=tox.ini"
+7. Set the "Working directory" to "$ProjectFileDir$"
+
+Now you can check your Python code style via:
+"Right-click any file or folder in the flink-python project -> External Tools -> flake8"
+
 {% top %}
diff --git a/docs/flinkDev/ide_setup.zh.md b/docs/flinkDev/ide_setup.zh.md
index 1177c6a..101a858 100644
--- a/docs/flinkDev/ide_setup.zh.md
+++ b/docs/flinkDev/ide_setup.zh.md
@@ -125,4 +125,38 @@ due to version incompatibilities with the bundled Scala 
version in Scala IDE 4.4
 
 **We recommend to use IntelliJ instead (see [above](#intellij-idea))**
 
+## PyCharm
+
+A brief guide on how to set up PyCharm for development of the flink-python 
module.
+
+The following documentation describes the steps to set up PyCharm 2019.1.3
+([https://www.jetbrains.com/pycharm/download/](https://www.jetbrains.com/pycharm/download/))
+with the Flink Python sources.
+
+### Importing flink-python
+If you are in the PyCharm startup interface:
+
+1. Start PyCharm and choose "Open"
+2. Select the flink-python folder in the cloned Flink repository
+
+If you have used PyCharm to open a project:
+
+1. Select "File -> Open"
+2. Select the flink-python folder in the cloned Flink repository
+
+
+### Checkstyle For Python
+To check the Python code style of Apache Flink, set up flake8 as an external tool in the project.
+
+1. Install flake8 in the Python interpreter used by the project (see [https://pypi.org/project/flake8/](https://pypi.org/project/flake8/)).
+2. Select "PyCharm -> Preferences... -> Tools -> External Tools -> + (the bottom left corner of the page on the right)", then configure the external tool.
+3. Set the "Name" to "flake8".
+4. Set the "Description" to "code style check".
+5. Set the "Program" to the Python interpreter path (e.g. /usr/bin/python).
+6. Set the "Arguments" to "-m flake8 \-\-config=tox.ini"
+7. Set the "Working directory" to "$ProjectFileDir$"
+
+Now you can check your Python code style via:
+"Right-click any file or folder in the flink-python project -> External Tools -> flake8"
+
 {% top %}



[flink] branch master updated: [FLINK-12767][python] Support user defined connectors/format

2019-07-04 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 80d7ba4  [FLINK-12767][python] Support user defined connectors/format
80d7ba4 is described below

commit 80d7ba440bd7389e76536e3b91ca227f4f850a67
Author: Dian Fu 
AuthorDate: Mon Jun 10 12:20:14 2019 +0800

[FLINK-12767][python] Support user defined connectors/format

This closes #8719
---
 flink-python/pom.xml   |   6 ++
 flink-python/pyflink/java_gateway.py   |   2 +
 flink-python/pyflink/table/descriptors.py  | 112 -
 .../pyflink/table/tests/test_descriptor.py |  35 ++-
 .../python/CustomConnectorDescriptor.java  |  72 +
 .../descriptors/python/CustomFormatDescriptor.java |  71 +
 6 files changed, 296 insertions(+), 2 deletions(-)

diff --git a/flink-python/pom.xml b/flink-python/pom.xml
index 276b890..ad4929f 100644
--- a/flink-python/pom.xml
+++ b/flink-python/pom.xml
@@ -56,6 +56,12 @@ under the License.
${project.version}
provided

+   
+   org.apache.flink
+   flink-table-common
+   ${project.version}
+   provided
+   
 

 
diff --git a/flink-python/pyflink/java_gateway.py 
b/flink-python/pyflink/java_gateway.py
index bf8ad76..9bb0b62 100644
--- a/flink-python/pyflink/java_gateway.py
+++ b/flink-python/pyflink/java_gateway.py
@@ -119,6 +119,8 @@ def import_flink_view(gateway):
 java_import(gateway.jvm, "org.apache.flink.table.api.dataview.*")
 java_import(gateway.jvm, "org.apache.flink.table.catalog.*")
 java_import(gateway.jvm, "org.apache.flink.table.descriptors.*")
+java_import(gateway.jvm, "org.apache.flink.table.descriptors.python.*")
+java_import(gateway.jvm, "org.apache.flink.table.sources.*")
 java_import(gateway.jvm, "org.apache.flink.table.sinks.*")
 java_import(gateway.jvm, "org.apache.flink.table.sources.*")
 java_import(gateway.jvm, "org.apache.flink.table.types.*")
diff --git a/flink-python/pyflink/table/descriptors.py 
b/flink-python/pyflink/table/descriptors.py
index cb8fc7b..457fc4f 100644
--- a/flink-python/pyflink/table/descriptors.py
+++ b/flink-python/pyflink/table/descriptors.py
@@ -24,6 +24,7 @@ from pyflink.table.types import _to_java_type
 from pyflink.java_gateway import get_gateway
 
 if sys.version >= '3':
+long = int
 unicode = str
 
 __all__ = [
@@ -39,7 +40,9 @@ __all__ = [
 'FileSystem',
 'ConnectTableDescriptor',
 'StreamTableDescriptor',
-'BatchTableDescriptor'
+'BatchTableDescriptor',
+'CustomConnectorDescriptor',
+'CustomFormatDescriptor'
 ]
 
 
@@ -611,6 +614,58 @@ class Json(FormatDescriptor):
 return self
 
 
+class CustomFormatDescriptor(FormatDescriptor):
+"""
+Describes the custom format of data.
+"""
+
+def __init__(self, type, version):
+"""
+Constructs a :class:`CustomFormatDescriptor`.
+
+:param type: String that identifies this format.
+:param version: Property version for backwards compatibility.
+"""
+
+if not isinstance(type, (str, unicode)):
+raise TypeError("type must be of type str.")
+if not isinstance(version, (int, long)):
+raise TypeError("version must be of type int.")
+gateway = get_gateway()
+super(CustomFormatDescriptor, self).__init__(
+gateway.jvm.CustomFormatDescriptor(type, version))
+
+def property(self, key, value):
+"""
+Adds a configuration property for the format.
+
+:param key: The property key to be set.
+:param value: The property value to be set.
+:return: This object.
+"""
+
+if not isinstance(key, (str, unicode)):
+raise TypeError("key must be of type str.")
+if not isinstance(value, (str, unicode)):
+raise TypeError("value must be of type str.")
+self._j_format_descriptor = self._j_format_descriptor.property(key, 
value)
+return self
+
+def properties(self, property_dict):
+"""
+Adds a set of properties for the format.
+
+:param property_dict: The dict object contains configuration 
properties for the format.
+  Both the keys and values should be strings.
+:return: This object.
+"""
+
+if not isinstance(property_dict, dict):
+raise TypeError("property_dict must be of type dict.")
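
A minimal usage sketch for the new descriptor class, assuming only what the diff above shows (the format name 'my-format' and the property keys below are placeholders for illustration, not real connector properties):

    from pyflink.table.descriptors import CustomFormatDescriptor

    # Describe a format that has no dedicated Python descriptor yet.
    # Constructor arguments: format type identifier and property version.
    fmt = CustomFormatDescriptor('my-format', 1) \
        .property('format.field-delimiter', '|') \
        .properties({'format.ignore-parse-errors': 'true'})

    # The resulting descriptor can then be passed wherever a FormatDescriptor
    # is accepted, e.g. connect(...).with_format(fmt).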

[flink] branch master updated: [FLINK-13087][table] Add group window Aggregate operator to Table API

2019-07-04 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new a0012aa  [FLINK-13087][table] Add group window Aggregate operator to 
Table API
a0012aa is described below

commit a0012aae89ec7a56133642b58b04e5f7b155c0f4
Author: hequn8128 
AuthorDate: Thu Jul 4 11:11:01 2019 +0800

[FLINK-13087][table] Add group window Aggregate operator to Table API

This closes #8979
---
 docs/dev/table/tableApi.md |  41 +-
 .../apache/flink/table/api/WindowGroupedTable.java |  39 +-
 .../apache/flink/table/api/internal/TableImpl.java | 109 +---
 .../operations/utils/OperationTreeBuilder.java | 143 ++---
 .../table/api/stream/table/AggregateTest.scala |  45 ++-
 .../stringexpr/AggregateStringExpressionTest.scala |  25 
 .../table/validation/AggregateValidationTest.scala |  21 ++-
 .../GroupWindowTableAggregateValidationTest.scala  |  15 +++
 .../validation/GroupWindowValidationTest.scala |  35 -
 .../runtime/stream/table/GroupWindowITCase.scala   |  32 +
 10 files changed, 461 insertions(+), 44 deletions(-)

diff --git a/docs/dev/table/tableApi.md b/docs/dev/table/tableApi.md
index bd6bd38..744a82c 100644
--- a/docs/dev/table/tableApi.md
+++ b/docs/dev/table/tableApi.md
@@ -2643,6 +2643,26 @@ Table table = input
 
 
   
+Group Window Aggregate
+Batch Streaming
+  
+  
+Groups and aggregates a table on a group window and possibly one or more
grouping keys. You have to close the "aggregate" with a select statement,
and the select statement does not support "*" or aggregate functions.
+{% highlight java %}
+AggregateFunction myAggFunc = new MyMinMax();
+tableEnv.registerFunction("myAggFunc", myAggFunc);
+
+Table table = input
+.window(Tumble.over("5.minutes").on("rowtime").as("w")) // define window
+.groupBy("key, w") // group by key and window
+.aggregate("myAggFunc(a) as (x, y)")
+.select("key, x, y, w.start, w.end"); // access window properties and 
aggregate results
+{% endhighlight %}
+  
+
+
+
+  
 FlatAggregate
 Streaming
 Result Updating
@@ -2837,7 +2857,7 @@ class MyMinMax extends AggregateFunction[Row, 
MyMinMaxAcc] {
   }
 }
 
-val myAggFunc: AggregateFunction = new MyMinMax
+val myAggFunc = new MyMinMax
 val table = input
   .groupBy('key)
   .aggregate(myAggFunc('a) as ('x, 'y))
@@ -2848,6 +2868,25 @@ val table = input
 
 
   
+Group Window Aggregate
+Batch Streaming
+  
+  
+Groups and aggregates a table on a group window and possibly one or more
grouping keys. You have to close the "aggregate" with a select statement,
and the select statement does not support "*" or aggregate functions.
+{% highlight scala %}
+val myAggFunc = new MyMinMax
+val table = input
+.window(Tumble over 5.minutes on 'rowtime as 'w) // define window
+.groupBy('key, 'w) // group by key and window
+.aggregate(myAggFunc('a) as ('x, 'y))
+.select('key, 'x, 'y, 'w.start, 'w.end) // access window properties and 
aggregate results
+
+{% endhighlight %}
+  
+
+
+
+  
 FlatAggregate
 Streaming
 Result Updating
diff --git 
a/flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/WindowGroupedTable.java
 
b/flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/WindowGroupedTable.java
index 0e1cf84..7e5a3ac 100644
--- 
a/flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/WindowGroupedTable.java
+++ 
b/flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/WindowGroupedTable.java
@@ -56,6 +56,43 @@ public interface WindowGroupedTable {
Table select(Expression... fields);
 
/**
+* Performs an aggregate operation on a window grouped table. You have 
to close the
+* {@link #aggregate(String)} with a select statement. The output will 
be flattened if the
+* output type is a composite type.
+*
+* Example:
+*
+* 
+* {@code
+*   AggregateFunction aggFunc = new MyAggregateFunction();
+*   tableEnv.registerFunction("aggFunc", aggFunc);
+*   windowGroupedTable
+* .aggregate("aggFunc(a, b) as (x, y, z)")
+* .select("key, window.start, x, y, z")
+* }
+* 
+*/
+   AggregatedTable aggregate(String aggregateFunction);
+
+   /**
+* Performs an aggregate operation on a window grouped table. You have 
to close the
+* {@link #aggregate(Expression)} with a select statement. The output will be
+* flattened if the output type is a composite type.

[flink] branch master updated: [FLINK-13077][python] Fix the failed test in CatalogPartitionAPICompletenessTests caused by the lack of "getComment" method. (#8968)

2019-07-03 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new bd5ca42  [FLINK-13077][python] Fix the failed test in 
CatalogPartitionAPICompletenessTests caused by the lack of "getComment" method. 
(#8968)
bd5ca42 is described below

commit bd5ca420f752a16e9e81a4eda5cc4e23bfad1679
Author: WeiZhong94 <44194288+weizhon...@users.noreply.github.com>
AuthorDate: Wed Jul 3 19:40:50 2019 +0800

[FLINK-13077][python] Fix the failed test in 
CatalogPartitionAPICompletenessTests caused by the lack of "getComment" method. 
(#8968)

This closes #8968
---
 flink-python/pyflink/table/catalog.py | 9 +
 1 file changed, 9 insertions(+)

diff --git a/flink-python/pyflink/table/catalog.py 
b/flink-python/pyflink/table/catalog.py
index 2748d77..6c6480c 100644
--- a/flink-python/pyflink/table/catalog.py
+++ b/flink-python/pyflink/table/catalog.py
@@ -736,6 +736,15 @@ class CatalogPartition(object):
 else:
 return None
 
+def get_comment(self):
+"""
+Get comment of the partition.
+
+:return: Comment of the partition.
+:rtype: str
+"""
+return self._j_catalog_partition.getComment()
+
 
 class CatalogFunction(object):
 """

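A short sketch of how the new method is reached from user code (a minimal sketch, assuming an existing catalog with a partitioned table; the database, table and partition-spec values are hypothetical):

    from pyflink.table.catalog import ObjectPath, CatalogPartitionSpec

    path = ObjectPath('my_db', 'my_partitioned_table')
    spec = CatalogPartitionSpec({'dt': '2019-07-03'})
    partition = catalog.get_partition(path, spec)  # returns a CatalogPartition
    print(partition.get_comment())                 # the partition comment as str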


[flink] branch master updated: [hotfix][docs] remove duplicate `to` in state doc

2019-07-02 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new d81ac48  [hotfix][docs] remove duplicate `to` in state doc
d81ac48 is described below

commit d81ac4899a1072ef38103572c6f7e459c94fa895
Author: sunjincheng121 
AuthorDate: Wed Jul 3 13:55:23 2019 +0800

[hotfix][docs] remove duplicate `to` in state doc
---
 docs/ops/state/large_state_tuning.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/ops/state/large_state_tuning.md 
b/docs/ops/state/large_state_tuning.md
index a98bb4a..e90880b 100644
--- a/docs/ops/state/large_state_tuning.md
+++ b/docs/ops/state/large_state_tuning.md
@@ -108,7 +108,7 @@ impact.
 
 For state to be snapshotted asynchronously, you need to use a state backend 
which supports asynchronous snapshotting.
 Starting from Flink 1.3, both RocksDB-based as well as heap-based state 
backends (`filesystem`) support asynchronous
-snapshotting and use it by default. This applies to to both managed operator 
state as well as managed keyed state (incl. timers state).
+snapshotting and use it by default. This applies to both managed operator 
state as well as managed keyed state (incl. timers state).
 
 Note *The combination RocksDB state 
backend with heap-based timers currently does NOT support asynchronous 
snapshots for the timers state.
 Other state like keyed state is still snapshotted asynchronously. Please note 
that this is not a regression from previous versions and will be resolved with 
`FLINK-10026`.*



[flink-web] branch asf-site updated: Rebuild website

2019-07-02 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 947ce88  Rebuild website
947ce88 is described below

commit 947ce88b7811825774d6681b535f483fabc14cfd
Author: sunjincheng121 
AuthorDate: Wed Jul 3 10:17:39 2019 +0800

Rebuild website
---
 .jekyll-metadata   | Bin 84463 -> 92656 bytes
 content/zh/flink-architecture.html |  49 ++---
 2 files changed, 24 insertions(+), 25 deletions(-)

diff --git a/.jekyll-metadata b/.jekyll-metadata
index bba9081..74b3891 100644
Binary files a/.jekyll-metadata and b/.jekyll-metadata differ
diff --git a/content/zh/flink-architecture.html 
b/content/zh/flink-architecture.html
index c39218e..1739ba3 100644
--- a/content/zh/flink-architecture.html
+++ b/content/zh/flink-architecture.html
@@ -182,9 +182,9 @@
 
 
 
-Apache Flink is a framework and distributed processing engine for stateful 
computations over unbounded and bounded data streams. Flink has been 
designed to run in all common cluster environments, perform 
computations at in-memory speed and at any scale.
+Apache Flink 是一个框架和分布式处理引擎,用于在无边界和有边界数据流上进行有状态的计算。Flink 
能在所有常见集群环境中运行,并能以内存速度和任意规模进行计算。
 
-Here, we explain important aspects of Flink’s architecture.
+接下来,我们来介绍一下 Flink 架构中的重要方面。
 
 
 
-Process Unbounded and Bounded 
Data
+处理无界和有界数据
 
-Any kind of data is produced as a stream of events. Credit card 
transactions, sensor measurements, machine logs, or user interactions on a 
website or mobile application, all of these data are generated as a stream.
+任何类型的数据都可以形成一种事件流。信用卡交易、传感器测量、机器日志、网站或移动应用程序上的用户交互记录,所有这些数据都形成一种流。
 
-Data can be processed as unbounded or bounded streams.
+数据可以被作为 无界 或者 有界 流来处理。
 
 
   
-Unbounded streams have a start but no defined end. 
They do not terminate and provide data as it is generated. Unbounded streams 
must be continuously processed, i.e., events must be promptly handled after 
they have been ingested. It is not possible to wait for all input data to 
arrive because the input is unbounded and will not be complete at any point in 
time. Processing unbounded data often requires that events are ingested in a 
specific order, such as the order  [...]
+无界流 
有定义流的开始,但没有定义流的结束。它们会无休止地产生数据。无界流的数据必须持续处理,即数据被摄取后需要立刻处理。我们不能等到所有数据都到达再处理,因为输入是无限的,在任何时候输入都不会完成。处理无界数据通常要求以特定顺序摄取事件,例如事件发生的顺序,以便能够推断结果的完整性。
   
   
-Bounded streams have a defined start and end. Bounded 
streams can be processed by ingesting all data before performing any 
computations. Ordered ingestion is not required to process bounded streams 
because a bounded data set can always be sorted. Processing of bounded streams 
is also known as batch processing.
+有界流 
有定义流的开始,也有定义流的结束。有界流可以在摄取所有数据后再进行计算。有界流所有数据可以被排序,所以并不需要有序摄取。有界流处理通常被称为批处理
   
 
 
@@ -211,26 +211,26 @@
   
 
 
-Apache Flink excels at processing unbounded and bounded data 
sets. Precise control of time and state enable Flink’s runtime to run 
any kind of application on unbounded streams. Bounded streams are internally 
processed by algorithms and data structures that are specifically designed for 
fixed sized data sets, yielding excellent performance.
+Apache Flink 擅长处理无界和有界数据集 精确的时间控制和状态化使得 Flink 
的运行时(runtime)能够运行任何处理无界流的应用。有界流则由一些专为固定大小数据集特殊设计的算法和数据结构进行内部处理,产生了出色的性能。
 
-Convince yourself by exploring the use cases 
that have been built on top of Flink.
+通过探索 Flink 之上构建的 用例 来加深理解。
 
-Deploy Applications Anywhere
+部署应用到任意地方
 
-Apache Flink is a distributed system and requires compute resources in 
order to execute applications. Flink integrates with all common cluster 
resource managers such as Hadoop YARN 
(https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/YARN.html), 
Apache Mesos (https://mesos.apache.org), and Kubernetes (https://kubernetes.io/) 
but can also be set up to run as a stand-alone cluster.
+Apache Flink 是一个分布式系统,它需要计算资源来执行应用程序。Flink 集成了所有常见的集群资源管理器,例如 Hadoop YARN 
(https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/YARN.html)、
Apache Mesos (https://mesos.apache.org) 和 Kubernetes (https://kubernetes.io/),但同时也可以作为独立集群运行。
 
-Flink is designed to work well with each of the previously listed resource 
managers. This is achieved by resource-manager-specific deployment modes that 
allow Flink to interact with each resource manager in its idiomatic way.
+Flink 
被设计为能够很好地工作在上述每个资源管理器中,这是通过资源管理器特定(resource-manager-specific)的部署模式实现的。Flink 
可以采用与当前资源管理器相适应的方式进行交互。
 
-When deploying a Flink application, Flink automatically identifies the 
required resources based on the application’s configured parallelism and 
requests them from the resource manager. In case of a failure, Flink replaces 
the failed container by requesting new resources. All communication to submit 
or control an application happens via REST calls. This eases the integration of 
Flink

[flink-web] branch asf-site updated: [FLINK-11561][docs-zh] Translate "Flink Architecture" page into Chinese

2019-07-02 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 5a2e8a1  [FLINK-11561][docs-zh] Translate "Flink Architecture" page 
into Chinese
5a2e8a1 is described below

commit 5a2e8a1dae1c0567dcdef94ed32fe94b87441845
Author: tom_gong 
AuthorDate: Fri May 10 19:02:13 2019 +0800

[FLINK-11561][docs-zh] Translate "Flink Architecture" page into Chinese

This closes #214
---
 flink-architecture.zh.md | 49 
 1 file changed, 24 insertions(+), 25 deletions(-)

diff --git a/flink-architecture.zh.md b/flink-architecture.zh.md
index 580ab30..e3935e0 100644
--- a/flink-architecture.zh.md
+++ b/flink-architecture.zh.md
@@ -16,9 +16,9 @@ title: "Apache Flink 是什么?"
 
 
 
-Apache Flink is a framework and distributed processing engine for stateful 
computations over *unbounded and bounded* data streams. Flink has been designed 
to run in *all common cluster environments*, perform computations at *in-memory 
speed* and at *any scale*.
+Apache Flink 是一个框架和分布式处理引擎,用于在*无边界和有边界*数据流上进行有状态的计算。Flink 
能在所有常见集群环境中运行,并能以内存速度和任意规模进行计算。
 
-Here, we explain important aspects of Flink's architecture.
+接下来,我们来介绍一下 Flink 架构中的重要方面。
 
 
 
-## Process Unbounded and Bounded Data
+## 处理无界和有界数据
 
-Any kind of data is produced as a stream of events. Credit card transactions, 
sensor measurements, machine logs, or user interactions on a website or mobile 
application, all of these data are generated as a stream. 
+任何类型的数据都可以形成一种事件流。信用卡交易、传感器测量、机器日志、网站或移动应用程序上的用户交互记录,所有这些数据都形成一种流。
 
-Data can be processed as *unbounded* or *bounded* streams. 
+数据可以被作为 *无界* 或者 *有界* 流来处理。
 
-1. **Unbounded streams** have a start but no defined end. They do not 
terminate and provide data as it is generated. Unbounded streams must be 
continuously processed, i.e., events must be promptly handled after they have 
been ingested. It is not possible to wait for all input data to arrive because 
the input is unbounded and will not be complete at any point in time. 
Processing unbounded data often requires that events are ingested in a specific 
order, such as the order in which events o [...]
+1. **无界流** 
有定义流的开始,但没有定义流的结束。它们会无休止地产生数据。无界流的数据必须持续处理,即数据被摄取后需要立刻处理。我们不能等到所有数据都到达再处理,因为输入是无限的,在任何时候输入都不会完成。处理无界数据通常要求以特定顺序摄取事件,例如事件发生的顺序,以便能够推断结果的完整性。
 
-2. **Bounded streams** have a defined start and end. Bounded streams can be 
processed by ingesting all data before performing any computations. Ordered 
ingestion is not required to process bounded streams because a bounded data set 
can always be sorted. Processing of bounded streams is also known as batch 
processing.
+2. **有界流** 
有定义流的开始,也有定义流的结束。有界流可以在摄取所有数据后再进行计算。有界流所有数据可以被排序,所以并不需要有序摄取。有界流处理通常被称为批处理
 
 
   
 
 
-**Apache Flink excels at processing unbounded and bounded data sets.** Precise 
control of time and state enable Flink's runtime to run any kind of application 
on unbounded streams. Bounded streams are internally processed by algorithms 
and data structures that are specifically designed for fixed sized data sets, 
yielding excellent performance. 
+**Apache Flink 擅长处理无界和有界数据集** 精确的时间控制和状态化使得 Flink 
的运行时(runtime)能够运行任何处理无界流的应用。有界流则由一些专为固定大小数据集特殊设计的算法和数据结构进行内部处理,产生了出色的性能。
 
-Convince yourself by exploring the [use cases]({{ site.baseurl 
}}/usecases.html) that have been built on top of Flink.
+通过探索 Flink 之上构建的 [用例]({{ site.baseurl }}/zh/usecases.html) 来加深理解。
 
-## Deploy Applications Anywhere
+## 部署应用到任意地方
 
-Apache Flink is a distributed system and requires compute resources in order 
to execute applications. Flink integrates with all common cluster resource 
managers such as [Hadoop 
YARN](https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/YARN.html),
 [Apache Mesos](https://mesos.apache.org), and 
[Kubernetes](https://kubernetes.io/) but can also be setup to run as a 
stand-alone cluster.
+Apache Flink 是一个分布式系统,它需要计算资源来执行应用程序。Flink 集成了所有常见的集群资源管理器,例如 [Hadoop 
YARN](https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/YARN.html)、
 [Apache Mesos](https://mesos.apache.org) 和 
[Kubernetes](https://kubernetes.io/),但同时也可以作为独立集群运行。
 
-Flink is designed to work well with each of the previously listed resource 
managers. This is achieved by resource-manager-specific deployment modes that 
allow Flink to interact with each resource manager in its idiomatic way. 
+Flink 
被设计为能够很好地工作在上述每个资源管理器中,这是通过资源管理器特定(resource-manager-specific)的部署模式实现的。Flink 
可以采用与当前资源管理器相适应的方式进行交互。
 
-When deploying a Flink application, Flink automatically identifies the 
required resources based on the application's configured parallelism and 
requests them from the resource manager. In case of a failure, Flink replaces 
the failed container by requesting new resources. All communication to submit 
or control an application happ

[flink-web] branch asf-site updated: Rebuild website

2019-07-02 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new b28e44e  Rebuild website
b28e44e is described below

commit b28e44efb01ee70d24899f406884d9f507c844e4
Author: sunjincheng121 
AuthorDate: Wed Jul 3 09:30:26 2019 +0800

Rebuild website
---
 .jekyll-metadata | Bin 0 -> 84463 bytes
 content/blog/feed.xml| 147 +++
 content/blog/index.html  |  38 ++-
 content/blog/page2/index.html|  38 +--
 content/blog/page3/index.html|  40 +--
 content/blog/page4/index.html|  40 +--
 content/blog/page5/index.html|  40 +--
 content/blog/page6/index.html|  38 ++-
 content/blog/page7/index.html|  36 ++-
 content/blog/page8/index.html|  38 +--
 content/blog/page9/index.html|  25 ++
 content/contributing/contribute-code.html|   2 +-
 content/contributing/improve-website.html|   2 +-
 content/downloads.html   |  28 +--
 content/index.html   |   8 +-
 content/news/2019/07/02/release-1.8.1.html   | 355 +++
 content/zh/contributing/improve-website.html |   2 +-
 content/zh/downloads.html|  32 +--
 content/zh/index.html|   8 +-
 19 files changed, 764 insertions(+), 153 deletions(-)

diff --git a/.jekyll-metadata b/.jekyll-metadata
new file mode 100644
index 000..bba9081
Binary files /dev/null and b/.jekyll-metadata differ
diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index cfadfc2..1c51569 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -7,6 +7,153 @@
 <atom:link href="https://flink.apache.org/blog/feed.xml" rel="self" type="application/rss+xml" />
 
 
+<item>
+<title>Apache Flink 1.8.1 Released</title>
+
+The Apache Flink community released the first bugfix version of the Apache Flink 1.8 series.
+
+This release includes more than 40 fixes and minor improvements for Flink 1.8.1. The list below includes a detailed list of all improvements, sub-tasks and bug fixes.
+
+We highly recommend all users to upgrade to Flink 1.8.1.
+
+Updated Maven dependencies:
+
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-java</artifactId>
+  <version>1.8.1</version>
+</dependency>
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-java_2.11</artifactId>
+  <version>1.8.1</version>
+</dependency>
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-clients_2.11</artifactId>
+  <version>1.8.1</version>
+</dependency>
+
+You can find the binaries on the updated Downloads page: http://flink.apache.org/downloads.html
+
+List of resolved issues:
+
+Sub-task
+- FLINK-10921 (https://issues.apache.org/jira/browse/FLINK-10921) - Prioritize shard consumers in Kinesis Consumer by event time
+- FLINK-12617 (https://issues.apache.org/jira/browse/FLINK-12617) - StandaloneJobClusterEntrypoint should default to random JobID for non-HA setups
+
+Bug
+- FLINK-9445 (https://issues.apache.org/jira/browse/FLINK-9445) - scala-shell uses plain java command
+- FLINK-10455 (https://issues.apache.org/jira/browse/FLINK-10455) - Potential Kafka producer leak in case of failures
+- FLINK-10941 (https://issues.apache.org/jira/browse/FLINK-10941) - Slots prematurely released which still contain unconsumed data
+- FLINK-11059 (https://issues.apache.org/jira/browse/FLINK-11059) - JobMaster may continue using an invalid slot if releasing idle slot meet a timeout
+- FLINK-11107 (https://issues.apache.org/jira/browse/FLINK-11107) - Avoid memory stateBackend to create arbitrary folders under HA path when no checkpoint path configured
+- FLINK-11897 (https://issues.apache.org/jira/browse/FLINK-11897) - ExecutionGraphSuspendTest does not wait for all tasks to be submitted
+- https://issues.ap

[flink-web] branch asf-site updated: Add Apache Flink release 1.8.1

2019-07-02 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git


The following commit(s) were added to refs/heads/asf-site by this push:
 new 70bcf43  Add Apache Flink release 1.8.1
70bcf43 is described below

commit 70bcf439500ba2565a027f2c137101268bf62027
Author: sunjincheng121 
AuthorDate: Mon Jun 17 16:04:54 2019 +0800

Add Apache Flink release 1.8.1
---
 _config.yml|  54 ++---
 _posts/2019-07-02-release-1.8.1.md | 152 +
 downloads.md   |   2 +-
 downloads.zh.md|   2 +-
 4 files changed, 181 insertions(+), 29 deletions(-)

diff --git a/_config.yml b/_config.yml
index a209cbd..11eea16 100644
--- a/_config.yml
+++ b/_config.yml
@@ -9,7 +9,7 @@ url: https://flink.apache.org
 
 DOCS_BASE_URL: https://ci.apache.org/projects/flink/
 
-FLINK_VERSION_STABLE: 1.8.0
+FLINK_VERSION_STABLE: 1.8.1
 FLINK_VERSION_STABLE_SHORT: 1.8
 
 FLINK_VERSION_LATEST: 1.9-SNAPSHOT
@@ -55,23 +55,23 @@ flink_releases:
   -
   version_short: 1.8
   binary_release:
-  name: "Apache Flink 1.8.0"
+  name: "Apache Flink 1.8.1"
   scala_211:
-  id: "180-download_211"
-  url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.8.0/flink-1.8.0-bin-scala_2.11.tgz"
-  asc_url: "https://www.apache.org/dist/flink/flink-1.8.0/flink-1.8.0-bin-scala_2.11.tgz.asc"
-  sha512_url: "https://www.apache.org/dist/flink/flink-1.8.0/flink-1.8.0-bin-scala_2.11.tgz.sha512"
+  id: "181-download_211"
+  url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.11.tgz"
+  asc_url: "https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.11.tgz.asc"
+  sha512_url: "https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.11.tgz.sha512"
   scala_212:
-  id: "180-download_212"
-  url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.8.0/flink-1.8.0-bin-scala_2.12.tgz"
-  asc_url: "https://www.apache.org/dist/flink/flink-1.8.0/flink-1.8.0-bin-scala_2.12.tgz.asc"
-  sha512_url: "https://www.apache.org/dist/flink/flink-1.8.0/flink-1.8.0-bin-scala_2.12.tgz.sha512"
+  id: "181-download_212"
+  url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.12.tgz"
+  asc_url: "https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.12.tgz.asc"
+  sha512_url: "https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-bin-scala_2.12.tgz.sha512"
   source_release:
-  name: "Apache Flink 1.8.0"
-  id: "180-download-source"
-  url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.8.0/flink-1.8.0-src.tgz"
-  asc_url: "https://www.apache.org/dist/flink/flink-1.8.0/flink-1.8.0-src.tgz.asc"
-  sha512_url: "https://www.apache.org/dist/flink/flink-1.8.0/flink-1.8.0-src.tgz.sha512"
+  name: "Apache Flink 1.8.1"
+  id: "181-download-source"
+  url: "https://www.apache.org/dyn/closer.lua/flink/flink-1.8.1/flink-1.8.1-src.tgz"
+  asc_url: "https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-src.tgz.asc"
+  sha512_url: "https://www.apache.org/dist/flink/flink-1.8.1/flink-1.8.1-src.tgz.sha512"
   optional_components:
 -
   name: "Pre-bundled Hadoop 2.4.1"
@@ -109,26 +109,26 @@ flink_releases:
   name: "Avro SQL Format"
   category: "SQL Formats"
   scala_dependent: false
-  id: 180-sql-format-avro
-  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.0/flink-avro-1.8.0.jar
-  asc_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.0/flink-avro-1.8.0.jar.asc
-  sha_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.0/flink-avro-1.8.0.jar.sha1
+  id: 181-sql-format-avro
+  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.1/flink-avro-1.8.1.jar
+  asc_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.1/flink-avro-1.8.1.jar.asc
+  sha_url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-avro/1.8.1/flink-avro-1.8.1.jar.sha1
 -
   name: "CSV SQL Format"
   category: "SQL Formats"
   scala_dependent: false
-  id: 180-sql-format-csv
-  url: 
https://repo.maven.apache.org/maven2/org/apache/flink/flink-csv/1.8.0/f

[flink] branch master updated: [hotfix][python] Remove the redundant 'python setup.py install' from tox.ini

2019-07-02 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 686bc84  [hotfix][python] Remove the redundant 'python setup.py 
install' from tox.ini
686bc84 is described below

commit 686bc841398dd14f054df8bf97c6d9ef8d8d99d9
Author: Dian Fu 
AuthorDate: Fri Jun 28 20:57:04 2019 +0800

[hotfix][python] Remove the redundant 'python setup.py install' from tox.ini

This closes #8947
---
 flink-python/tox.ini | 1 -
 1 file changed, 1 deletion(-)

diff --git a/flink-python/tox.ini b/flink-python/tox.ini
index c475adf..e0a6c29 100644
--- a/flink-python/tox.ini
+++ b/flink-python/tox.ini
@@ -30,7 +30,6 @@ deps =
 pytest
 commands =
 python --version
-python setup.py install --force
 pytest
 bash ./dev/run_pip_test.sh
 



svn commit: r34725 - /release/flink/flink-1.8.0/

2019-07-01 Thread jincheng
Author: jincheng
Date: Tue Jul  2 05:51:56 2019
New Revision: 34725

Log:
remove flink-1.8.0

Removed:
release/flink/flink-1.8.0/



svn commit: r34724 - /dev/flink/flink-1.8.1-rc1/ /release/flink/flink-1.8.1/

2019-07-01 Thread jincheng
Author: jincheng
Date: Tue Jul  2 05:48:27 2019
New Revision: 34724

Log:
Release Flink 1.8.1

Added:
release/flink/flink-1.8.1/
  - copied from r34723, dev/flink/flink-1.8.1-rc1/
Removed:
dev/flink/flink-1.8.1-rc1/



[flink] branch master updated: [FLINK-11147][table][docs] Add documentation for TableAggregate Function

2019-07-01 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 6a12908  [FLINK-11147][table][docs] Add documentation for 
TableAggregate Function
6a12908 is described below

commit 6a12908b15c398e37f8603cd84e0d30e14d07784
Author: hequn8128 
AuthorDate: Fri Jun 21 12:48:11 2019 +0800

[FLINK-11147][table][docs] Add documentation for TableAggregate Function

This closes #8669
---
 docs/dev/table/tableApi.md|   4 +-
 docs/dev/table/tableApi.zh.md |   4 +-
 docs/dev/table/udfs.md| 704 +++---
 docs/dev/table/udfs.zh.md | 704 +++---
 docs/fig/udtagg-mechanism.png | Bin 0 -> 150838 bytes
 5 files changed, 1328 insertions(+), 88 deletions(-)

diff --git a/docs/dev/table/tableApi.md b/docs/dev/table/tableApi.md
index 4fb9f75..a0bd51a 100644
--- a/docs/dev/table/tableApi.md
+++ b/docs/dev/table/tableApi.md
@@ -2645,7 +2645,7 @@ Table table = input
   
   
 Similar to a GroupBy Aggregation. Groups the rows on the 
grouping keys with the following running table aggregation operator to 
aggregate rows group-wise. The difference from an AggregateFunction is that 
TableAggregateFunction may return 0 or more records for a group. You have to 
close the "flatAggregate" with a select statement. And the select statement 
does not support aggregate functions.
-Instead of using emitValue to output results, you can 
also use the emitUpdateWithRetract method. Different from 
emitValue, emitUpdateWithRetract is used to emit 
values that have been updated. This method outputs data incrementally in 
retract mode, i.e., once there is an update, we have to retract old records 
before sending new updated ones. The emitUpdateWithRetract method 
will be used in preference to the  [...]
+Instead of using emitValue to output results, you can 
also use the emitUpdateWithRetract method. Different from 
emitValue, emitUpdateWithRetract is used to emit 
values that have been updated. This method outputs data incrementally in 
retract mode, i.e., once there is an update, we have to retract old records 
before sending new updated ones. The emitUpdateWithRetract method 
will be used in preference to the  [...]
 {% highlight java %}
 /**
  * Accumulator for Top2.
@@ -2850,7 +2850,7 @@ val table = input
   
   
 Similar to a GroupBy Aggregation. Groups the rows on the 
grouping keys with the following running table aggregation operator to 
aggregate rows group-wise. The difference from an AggregateFunction is that 
TableAggregateFunction may return 0 or more records for a group. You have to 
close the "flatAggregate" with a select statement. And the select statement 
does not support aggregate functions.
-Instead of using emitValue to output results, you can 
also use the emitUpdateWithRetract method. Different from 
emitValue, emitUpdateWithRetract is used to emit 
values that have been updated. This method outputs data incrementally in 
retract mode, i.e., once there is an update, we have to retract old records 
before sending new updated ones. The emitUpdateWithRetract method 
will be used in preference to the  [...]
+Instead of using emitValue to output results, you can 
also use the emitUpdateWithRetract method. Different from 
emitValue, emitUpdateWithRetract is used to emit 
values that have been updated. This method outputs data incrementally in 
retract mode, i.e., once there is an update, we have to retract old records 
before sending new updated ones. The emitUpdateWithRetract method 
will be used in preference to the  [...]
 {% highlight scala %}
 import java.lang.{Integer => JInteger}
 import org.apache.flink.table.api.Types
diff --git a/docs/dev/table/tableApi.zh.md b/docs/dev/table/tableApi.zh.md
index 33ad6f9..7a75c0a 100644
--- a/docs/dev/table/tableApi.zh.md
+++ b/docs/dev/table/tableApi.zh.md
@@ -2644,7 +2644,7 @@ Table table = input
   
   
 Similar to a GroupBy Aggregation. Groups the rows on the 
grouping keys with the following running table aggregation operator to 
aggregate rows group-wise. The difference from an AggregateFunction is that 
TableAggregateFunction may return 0 or more records for a group. You have to 
close the "flatAggregate" with a select statement. And the select statement 
does not support aggregate functions.
-Instead of using emitValue to output results, you can 
also use the emitUpdateWithRetract method. Different from 
emitValue, emitUpdateWithRetract is used to emit 
values that have been updated. This method outputs data incrementally in 
retract mode, i.e., once there is an update, we have to retract old records 
before sending new updated ones. The emitUpdateWithRetract method 
will be used in preference to the  [...]
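
To make the difference between the two emit methods concrete, an illustrative plain-Python sketch (this is not the Flink API; the collector's collect/retract pair mirrors the retractable collector used by the Java examples):

    class Top2:
        """Illustrative table aggregate tracking the two largest values."""

        def create_accumulator(self):
            return [None, None]  # [largest, second largest]

        def accumulate(self, acc, value):
            if acc[0] is None or value > acc[0]:
                acc[0], acc[1] = value, acc[0]
            elif acc[1] is None or value > acc[1]:
                acc[1] = value

        def emit_value(self, acc, out):
            # re-emits the complete result after every update
            for rank, v in enumerate(acc, start=1):
                if v is not None:
                    out.collect((v, rank))

        def emit_update_with_retract(self, acc, old_acc, out):
            # emits only the changes: retract stale records, then send updates
            for rank, (old, new) in enumerate(zip(old_acc, acc), start=1):
                if old != new:
                    if old is not None:
                        out.retract((old, rank))
                    if new is not None:
                        out.collect((new, rank))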

[flink] branch master updated: [FLINK-12897][python][docs] Improve the Python Table API docs by adding more examples.

2019-06-30 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 534d53b  [FLINK-12897][python][docs] Improve the Python Table API docs 
by adding more examples.
534d53b is described below

commit 534d53bf828404ef3dcd7aad9e7a8b6528606254
Author: Wei Zhong 
AuthorDate: Thu Jun 27 20:30:36 2019 +0800

[FLINK-12897][python][docs] Improve the Python Table API docs by adding 
more examples.

This closes #8916
---
 docs/dev/event_time.md |  11 +
 docs/dev/event_time.zh.md  |  11 +
 docs/dev/event_timestamps_watermarks.md|   6 +
 docs/dev/event_timestamps_watermarks.zh.md |   6 +
 docs/dev/stream/state/checkpointing.md |  28 ++
 docs/dev/stream/state/checkpointing.zh.md  |  28 ++
 docs/dev/stream/state/state_backends.md|   6 +
 docs/dev/stream/state/state_backends.zh.md |   6 +
 docs/dev/table/common.md   | 210 -
 docs/dev/table/common.zh.md| 206 -
 docs/dev/table/connect.md  | 335 
 docs/dev/table/connect.zh.md   | 336 +
 docs/dev/table/functions.md|  40 ++-
 docs/dev/table/functions.zh.md |  40 ++-
 docs/dev/table/sql.md  |  24 ++
 docs/dev/table/sql.zh.md   |  25 ++
 docs/dev/table/streaming/query_configuration.md|  35 +++
 docs/dev/table/streaming/query_configuration.zh.md |  35 +++
 docs/dev/table/streaming/time_attributes.md|  11 +
 docs/dev/table/streaming/time_attributes.zh.md |  11 +
 docs/dev/table/tableApi.md |  12 +-
 docs/dev/table/tableApi.zh.md  |  12 +-
 .../pyflink/dataset/execution_environment.py   |   8 +-
 .../datastream/stream_execution_environment.py |   4 +-
 flink-python/pyflink/table/__init__.py |   3 +-
 flink-python/pyflink/table/descriptors.py  |  10 +-
 flink-python/pyflink/table/query_config.py |   9 +
 flink-python/pyflink/table/sinks.py|  12 +-
 flink-python/pyflink/table/table.py|  17 +-
 flink-python/pyflink/table/table_environment.py| 167 --
 30 files changed, 1564 insertions(+), 100 deletions(-)

diff --git a/docs/dev/event_time.md b/docs/dev/event_time.md
index 1d747aa..5790746 100644
--- a/docs/dev/event_time.md
+++ b/docs/dev/event_time.md
@@ -131,6 +131,17 @@ stream
 .addSink(...)
 {% endhighlight %}
 
+
+{% highlight python %}
+env = StreamExecutionEnvironment.get_execution_environment()
+
+env.set_stream_time_characteristic(TimeCharacteristic.ProcessingTime)
+
+# alternatively:
+# env.set_stream_time_characteristic(TimeCharacteristic.IngestionTime)
+# env.set_stream_time_characteristic(TimeCharacteristic.EventTime)
+{% endhighlight %}
+
 
 
 
diff --git a/docs/dev/event_time.zh.md b/docs/dev/event_time.zh.md
index 1d747aa..5790746 100644
--- a/docs/dev/event_time.zh.md
+++ b/docs/dev/event_time.zh.md
@@ -131,6 +131,17 @@ stream
 .addSink(...)
 {% endhighlight %}
 
+
+{% highlight python %}
+env = StreamExecutionEnvironment.get_execution_environment()
+
+env.set_stream_time_characteristic(TimeCharacteristic.ProcessingTime)
+
+# alternatively:
+# env.set_stream_time_characteristic(TimeCharacteristic.IngestionTime)
+# env.set_stream_time_characteristic(TimeCharacteristic.EventTime)
+{% endhighlight %}
+
 
 
 
diff --git a/docs/dev/event_timestamps_watermarks.md 
b/docs/dev/event_timestamps_watermarks.md
index cb1c5d4..fdb716b 100644
--- a/docs/dev/event_timestamps_watermarks.md
+++ b/docs/dev/event_timestamps_watermarks.md
@@ -44,6 +44,12 @@ val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
 {% endhighlight %}
 
+
+{% highlight python %}
+env = StreamExecutionEnvironment.get_execution_environment()
+env.set_stream_time_characteristic(TimeCharacteristic.EventTime)
+{% endhighlight %}
+
 
 
 ## Assigning Timestamps
diff --git a/docs/dev/event_timestamps_watermarks.zh.md 
b/docs/dev/event_timestamps_watermarks.zh.md
index cb1c5d4..fdb716b 100644
--- a/docs/dev/event_timestamps_watermarks.zh.md
+++ b/docs/dev/event_timestamps_watermarks.zh.md
@@ -44,6 +44,12 @@ val env = StreamExecutionEnvironment.getExecutionEnvironment
 env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
 {% endhighlight %}
 
+
+{% highlight python %}
+env = StreamExecutionEnvironment.get_execution_environment()
+env.set_stream_time_characteristic(TimeCharacteristic.EventTime)
+{% endhighlight %}
+
 
 
 ## Assigning Timestamps
diff --git a/docs/dev/stream/state/checkpointing.md 
b/docs/dev/stream/state/checkpointing.md
index
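
The checkpointing additions follow the same pattern as the snippets above; a sketch of the Python side (assuming the snake_case names PyFlink uses elsewhere; check them against the release you run):

    from pyflink.datastream import StreamExecutionEnvironment, CheckpointingMode

    env = StreamExecutionEnvironment.get_execution_environment()
    env.enable_checkpointing(1000)  # start a checkpoint every 1000 ms
    env.get_checkpoint_config().set_checkpointing_mode(CheckpointingMode.EXACTLY_ONCE)
    env.get_checkpoint_config().set_min_pause_between_checkpoints(500)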

[flink] branch master updated: [hotfix][python] Aligns with Java Table API by removing methods exec_env and query_config

2019-06-28 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 2d7e64a  [hotfix][python] Aligns with Java Table API by removing 
methods exec_env and query_config
2d7e64a is described below

commit 2d7e64aa481ad03069ecdbdffbc8a22254e94d72
Author: Dian Fu 
AuthorDate: Thu Jun 27 15:47:50 2019 +0800

[hotfix][python] Aligns with Java Table API by removing methods exec_env 
and query_config

This closes #8910
---
 flink-python/dev/pip_test_code.py  |  6 +--
 .../dataset/tests/test_execution_environment.py|  2 +-
 .../tests/test_stream_execution_environment.py |  2 +-
 flink-python/pyflink/shell.py  | 37 +---
 .../pyflink/table/examples/batch/word_count.py | 10 +++--
 flink-python/pyflink/table/table.py|  2 +-
 flink-python/pyflink/table/table_environment.py| 51 +-
 flink-python/pyflink/table/tests/test_calc.py  |  2 +-
 .../pyflink/table/tests/test_descriptor.py |  4 +-
 .../pyflink/table/tests/test_shell_example.py  | 12 ++---
 .../table/tests/test_table_environment_api.py  | 17 
 flink-python/tox.ini   |  2 +-
 12 files changed, 53 insertions(+), 94 deletions(-)

diff --git a/flink-python/dev/pip_test_code.py 
b/flink-python/dev/pip_test_code.py
index c9a4798..29f9841 100644
--- a/flink-python/dev/pip_test_code.py
+++ b/flink-python/dev/pip_test_code.py
@@ -16,7 +16,7 @@
 # limitations under the License.
 

 # test pyflink shell environment
-from pyflink.shell import bt_env, FileSystem, OldCsv, DataTypes, Schema
+from pyflink.shell import b_env, bt_env, FileSystem, OldCsv, DataTypes, Schema
 
 import tempfile
 import os
@@ -28,7 +28,7 @@ if os.path.exists(sink_path):
 os.remove(sink_path)
 else:
 shutil.rmtree(sink_path)
-bt_env.exec_env().set_parallelism(1)
+b_env.set_parallelism(1)
 t = bt_env.from_elements([(1, 'hi', 'hello'), (2, 'hi', 'hello')], ['a', 'b', 
'c'])
 bt_env.connect(FileSystem().path(sink_path)) \
 .with_format(OldCsv()
@@ -44,7 +44,7 @@ bt_env.connect(FileSystem().path(sink_path)) \
 
 t.select("a + 1, b, c").insert_into("batch_sink")
 
-bt_env.exec_env().execute()
+b_env.execute()
 
 with open(sink_path, 'r') as f:
 lines = f.read()
diff --git a/flink-python/pyflink/dataset/tests/test_execution_environment.py 
b/flink-python/pyflink/dataset/tests/test_execution_environment.py
index 7adac00..9dbed96 100644
--- a/flink-python/pyflink/dataset/tests/test_execution_environment.py
+++ b/flink-python/pyflink/dataset/tests/test_execution_environment.py
@@ -110,6 +110,6 @@ class ExecutionEnvironmentTests(PyFlinkTestCase):
 CsvTableSink(field_names, field_types, tmp_csv))
 t_env.scan("Orders").insert_into("Results")
 
-plan = t_env.exec_env().get_execution_plan()
+plan = self.env.get_execution_plan()
 
 json.loads(plan)
diff --git 
a/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py 
b/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py
index 43768fa..b02f660 100644
--- a/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py
+++ b/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py
@@ -187,6 +187,6 @@ class StreamExecutionEnvironmentTests(PyFlinkTestCase):
 CsvTableSink(field_names, field_types, tmp_csv))
 t_env.scan("Orders").insert_into("Results")
 
-plan = t_env.exec_env().get_execution_plan()
+plan = self.env.get_execution_plan()
 
 json.loads(plan)
diff --git a/flink-python/pyflink/shell.py b/flink-python/pyflink/shell.py
index 2ea3a6b..97b7065 100644
--- a/flink-python/pyflink/shell.py
+++ b/flink-python/pyflink/shell.py
@@ -20,8 +20,9 @@ import codecs
 import platform
 import sys
 
-from pyflink.dataset import ExecutionEnvironment
-from pyflink.datastream import StreamExecutionEnvironment
+from pyflink.common import *
+from pyflink.dataset import *
+from pyflink.datastream import *
 from pyflink.table import *
 from pyflink.table.catalog import *
 from pyflink.table.descriptors import *
@@ -80,7 +81,7 @@ welcome_msg = u'''
 
 NOTE: Use the prebound Table Environment to implement batch or streaming Table 
programs.
 
-  Batch - Use the 'bt_env' variable
+  Batch - Use 'b_env' and 'bt_env' variables
 
 *
 * import tempfile
@@ -92,25 +93,25 @@ NOTE: Use the prebound Table Environment to implement batch 
or streaming Table p
 * os.remove(sink_path)
 * else:
 * shutil.rmtree(sink_path)
-* bt_env.exec_env().set_parallelism(1)
+* b_env.set_parallelism(1)
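
For existing shell sessions the migration is mechanical; a sketch using the prebound variables from the welcome message above:

    # before: parallelism and execution went through the table environment
    # bt_env.exec_env().set_parallelism(1)
    # ...
    # bt_env.exec_env().execute()

    # after: use the prebound batch execution environment directly
    b_env.set_parallelism(1)
    # ... define the table program with bt_env ...
    b_env.execute()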

[flink] branch master updated: [FLINK-12962][python] Allows pyflink to be pip installed.

2019-06-27 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new f059b19  [FLINK-12962][python] Allows pyflink to be pip installed.
f059b19 is described below

commit f059b1981b8d2a62acfd2fac3456b8ca3a5bd18f
Author: Wei Zhong 
AuthorDate: Tue Jun 25 09:51:05 2019 +0800

[FLINK-12962][python] Allows pyflink to be pip installed.

This closes #8863
---
 docs/flinkDev/building.md  |  15 ++
 docs/flinkDev/building.zh.md   |  15 ++
 flink-dist/src/main/flink-bin/bin/config.sh|   5 +-
 .../src/main/flink-bin/bin/find-flink-home.sh  |  11 +-
 flink-dist/src/main/flink-bin/bin/pyflink-shell.sh |   5 +-
 flink-python/{pyflink/version.py => MANIFEST.in}   |  14 +-
 flink-python/dev/pip_test_code.py  |  53 ++
 .../{pyflink/version.py => dev/run_pip_test.sh}|   5 +-
 flink-python/pyflink/find_flink_home.py|  31 ++-
 flink-python/pyflink/shell.py  |  12 +-
 flink-python/pyflink/version.py|   6 +-
 flink-python/{pyflink/version.py => setup.cfg} |   6 +-
 flink-python/setup.py  | 210 ++---
 flink-python/tox.ini   |   3 +
 tools/releasing/update_branch_version.sh   |   6 +
 15 files changed, 355 insertions(+), 42 deletions(-)

diff --git a/docs/flinkDev/building.md b/docs/flinkDev/building.md
index 521bc9f..c78bb3b 100644
--- a/docs/flinkDev/building.md
+++ b/docs/flinkDev/building.md
@@ -58,6 +58,21 @@ mvn clean install -DskipTests -Dfast
 
 The default build adds a Flink-specific JAR for Hadoop 2, to allow using Flink 
with HDFS and YARN.
 
+## Build PyFlink
+
+If you want to build a PyFlink package that can be used for pip installation, 
you need to build the Flink jars first, as described in [Build Flink](#build-flink).
+Then go to the root directory of the Flink source code and run this command to 
build the sdist and wheel packages:
+
+{% highlight bash %}
+cd flink-python; python3 setup.py sdist bdist_wheel
+{% endhighlight %}
+
+The sdist and wheel packages can be found under `./flink-python/dist/`. Either 
of them can be used for pip installation, for example:
+
+{% highlight bash %}
+pip install dist/*.tar.gz
+{% endhighlight %}
+
 ## Dependency Shading
 
 Flink [shades away](https://maven.apache.org/plugins/maven-shade-plugin/) some 
of the libraries it uses, in order to avoid version clashes with user programs 
that use different versions of these libraries. Among the shaded libraries are 
*Google Guava*, *Asm*, *Apache Curator*, *Apache HTTP Components*, *Netty*, and 
others.
diff --git a/docs/flinkDev/building.zh.md b/docs/flinkDev/building.zh.md
index c48de50..aa70511 100644
--- a/docs/flinkDev/building.zh.md
+++ b/docs/flinkDev/building.zh.md
@@ -58,6 +58,21 @@ mvn clean install -DskipTests -Dfast
 
 The default build adds a Flink-specific JAR for Hadoop 2, to allow using Flink 
with HDFS and YARN.
 
+## 构建PyFlink
+
+如果您想构建一个可用于pip安装的PyFlink包,您需要先构建Flink的Jar包,如[构建Flink](#build-flink)中所述。
+之后,进入Flink源码根目录,并执行以下命令,构建PyFlink的源码发布包和wheel包:
+
+{% highlight bash %}
+cd flink-python; python3 setup.py sdist bdist_wheel
+{% endhighlight %}
+
+构建好的源码发布包和wheel包位于`./flink-python/dist/`目录下。它们均可使用pip安装,比如:
+
+{% highlight bash %}
+pip install dist/*.tar.gz
+{% endhighlight %}
+
 ## Dependency Shading
 
 Flink [shades away](https://maven.apache.org/plugins/maven-shade-plugin/) some 
of the libraries it uses, in order to avoid version clashes with user programs 
that use different versions of these libraries. Among the shaded libraries are 
*Google Guava*, *Asm*, *Apache Curator*, *Apache HTTP Components*, *Netty*, and 
others.
diff --git a/flink-dist/src/main/flink-bin/bin/config.sh 
b/flink-dist/src/main/flink-bin/bin/config.sh
index 79fa6ad..4dc57df 100755
--- a/flink-dist/src/main/flink-bin/bin/config.sh
+++ b/flink-dist/src/main/flink-bin/bin/config.sh
@@ -296,7 +296,10 @@ bin=`dirname "$target"`
 SYMLINK_RESOLVED_BIN=`cd "$bin"; pwd -P`
 
 # Define the main directory of the flink installation
-FLINK_HOME=`dirname "$SYMLINK_RESOLVED_BIN"`
+# If config.sh is called by pyflink-shell.sh in python bin directory(pip 
installed), then do not need to set the FLINK_HOME here.
+if [ -z "$_FLINK_HOME_DETERMINED" ]; then
+FLINK_HOME=`dirname "$SYMLINK_RESOLVED_BIN"`
+fi
 FLINK_LIB_DIR=$FLINK_HOME/lib
 FLINK_PLUGINS_DIR=$FLINK_HOME/plugins
 FLINK_OPT_DIR=$FLINK_HOME/opt
diff --git a/flink-python/pyflink/version.py 
b/flink-dist/src/main/flink-bin/bin/find-flink-home.sh
old mode 100644
new mode 100755
similarity index 72%
copy from flink-python/pyflink/version.py
copy to flink-dist/src/main/flink-bin/bin/find-flink-home.sh
index ca27a42..e0fe95f
--- a/flink-py
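
After `pip install`, a quick way to verify the installation is to create a table environment (a minimal sketch; it mirrors the classes used in the tutorial below):

    from pyflink.dataset import ExecutionEnvironment
    from pyflink.table import BatchTableEnvironment, TableConfig

    env = ExecutionEnvironment.get_execution_environment()
    t_env = BatchTableEnvironment.create(env, TableConfig())
    print('PyFlink is installed and a TableEnvironment was created')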

[flink] branch master updated: [FLINK-12722][docs] Adds Python Table API tutorial

2019-06-27 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new fd2865a  [FLINK-12722][docs] Adds Python Table API tutorial
fd2865a is described below

commit fd2865a54fc0790d6774d1b645da4a65e3f5d741
Author: Dian Fu 
AuthorDate: Mon Jun 24 22:22:35 2019 +0800

[FLINK-12722][docs] Adds Python Table API tutorial

This closes #8907
---
 docs/tutorials/python_table_api.md| 158 ++
 docs/tutorials/python_table_api.zh.md | 146 +++
 2 files changed, 304 insertions(+)

diff --git a/docs/tutorials/python_table_api.md 
b/docs/tutorials/python_table_api.md
new file mode 100644
index 000..8a6e866
--- /dev/null
+++ b/docs/tutorials/python_table_api.md
@@ -0,0 +1,158 @@
+---
+title: "Python API Tutorial"
+nav-title: Python API
+nav-parent_id: apitutorials
+nav-pos: 10
+---
+
+
+* This will be replaced by the TOC
+{:toc}
+
+In this guide we will start from scratch and go from setting up a Flink Python 
project
+to running a Python Table API program.
+
+## Setting up a Python Project
+
+First, fire up your favorite IDE and create a Python project, and then
+install the PyFlink package. Please
+see [Build PyFlink]({{ site.baseurl }}/flinkDev/building.html#build-pyflink)
+for more details.
+
+## Writing a Flink Python Table API Program
+
+The first step in a Flink Python Table API program is to create a 
`BatchTableEnvironment`
+(or `StreamTableEnvironment` if you are writing a streaming job). It is the 
main entry point
+for Python Table API jobs.
+
+{% highlight python %}
+exec_env = ExecutionEnvironment.get_execution_environment()
+exec_env.set_parallelism(1)
+t_config = TableConfig()
+t_env = BatchTableEnvironment.create(exec_env, t_config)
+{% endhighlight %}
+
+The `ExecutionEnvironment` (or `StreamExecutionEnvironment` if you are writing 
a streaming job)
+can be used to set execution parameters, such as the restart strategy, default 
parallelism, etc.
+
+The `TableConfig` can be used to set parameters such as the built-in 
catalog name, the
+threshold for code generation, etc.
+
+Next we will create a source table and a sink table.
+
+{% highlight python %}
+t_env.connect(FileSystem().path('/tmp/input')) \
+.with_format(OldCsv()
+ .line_delimiter(' ')
+ .field('word', DataTypes.STRING())) \
+.with_schema(Schema()
+ .field('word', DataTypes.STRING())) \
+.register_table_source('mySource')
+
+t_env.connect(FileSystem().path('/tmp/output')) \
+.with_format(OldCsv()
+ .field_delimiter('\t')
+ .field('word', DataTypes.STRING())
+ .field('count', DataTypes.BIGINT())) \
+.with_schema(Schema()
+ .field('word', DataTypes.STRING())
+ .field('count', DataTypes.BIGINT())) \
+.register_table_sink('mySink')
+{% endhighlight %}
+
+This registers a table named `mySource` and a table named `mySink` in the
+`BatchTableEnvironment`. The table `mySource` has only one column: word.
+It represents the words read from file `/tmp/input`. The table `mySink` has 
two columns:
+word and count. It writes data to file `/tmp/output`, with `\t` as the field 
delimiter.
+
+Then we need to create a job which reads input from table `mySource`, performs 
some
+operations and writes the results to table `mySink`.
+
+{% highlight python %}
+t_env.scan('mySource') \
+.group_by('word') \
+.select('word, count(1)') \
+.insert_into('mySink')
+{% endhighlight %}
+
+The last thing is to start the actual Flink Python Table API job. All 
operations, such as
+creating sources, transformations and sinks, only build up a graph of internal 
operations.
+Only when `exec_env.execute()` is called is this graph of operations 
submitted to a cluster or
+executed on your local machine.
+
+{% highlight python %}
+exec_env.execute()
+{% endhighlight %}
+
+The complete code so far is as follows:
+
+{% highlight python %}
+from pyflink.dataset import ExecutionEnvironment
+from pyflink.table import TableConfig, DataTypes, BatchTableEnvironment
+from pyflink.table.descriptors import Schema, OldCsv, FileSystem
+
+exec_env = ExecutionEnvironment.get_execution_environment()
+exec_env.set_parallelism(1)
+t_config = TableConfig()
+t_env = BatchTableEnvironment.create(exec_env, t_config)
+
+t_env.connect(FileSystem().path('/tmp/input')) \
+.with_format(OldCsv()
+ .line_delimiter(' ')
+ .field('word', DataTypes.STRING())) \
+.with_schema(Schema()
+ .field('word', DataTypes.STRING())) \
+.register_table_source('mySource')
+
+t_env.connect(FileSystem().path('/tmp/output')) \
+.with_format(OldCsv()
+ .field_delimiter('\t')
+ .field('word', DataTypes.STRING())
+ .field('count', DataTypes.BIGINT())) \
+.with_schema(Schema()
+ .field('word', DataTypes.STRING())
+ .field('count', DataTypes.BIGINT())) \
+.register_table_sink('mySink')
+
+t_env.scan('mySource') \
+.group_by('word') \
+.select('word, count(1)') \
+.insert_into('mySink')
+
+exec_env.execute()
+{% endhighlight %}

[flink] branch master updated: [FLINK-12609][python] Add LocalZonedTimestampType/ZonedTimestampType/DayTimeIntervalType/YearMonthIntervalType

2019-06-26 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 0ed394e  [FLINK-12609][python] Add 
LocalZonedTimestampType/ZonedTimestampType/DayTimeIntervalType/YearMonthIntervalType
0ed394e is described below

commit 0ed394ef0908a88502d198a90f49e04b123a4eda
Author: Dian Fu 
AuthorDate: Thu Jun 20 19:27:00 2019 +0800

[FLINK-12609][python] Add 
LocalZonedTimestampType/ZonedTimestampType/DayTimeIntervalType/YearMonthIntervalType

This closes #8847
---
 flink-python/pyflink/table/table_environment.py|   4 +-
 flink-python/pyflink/table/tests/test_calc.py  |  32 +-
 flink-python/pyflink/table/tests/test_types.py |  44 +-
 flink-python/pyflink/table/types.py| 626 +++--
 flink-python/setup.py  |   2 +-
 .../flink/table/util/python/PythonTableUtils.scala |   5 +
 6 files changed, 663 insertions(+), 50 deletions(-)

diff --git a/flink-python/pyflink/table/table_environment.py 
b/flink-python/pyflink/table/table_environment.py
index 4932861..bfb7f32 100644
--- a/flink-python/pyflink/table/table_environment.py
+++ b/flink-python/pyflink/table/table_environment.py
@@ -494,9 +494,11 @@ class TableEnvironment(object):
 raise TypeError(
 "schema should be RowType, list, tuple or None, but got: %s" % 
schema)
 
+# verifies the elements against the specified schema
+elements = map(verify_obj, elements)
 # converts python data to sql data
 elements = [schema.to_sql_type(element) for element in elements]
-return self._from_elements(map(verify_obj, elements), schema)
+return self._from_elements(elements, schema)
 
 def _from_elements(self, elements, schema):
 """
diff --git a/flink-python/pyflink/table/tests/test_calc.py 
b/flink-python/pyflink/table/tests/test_calc.py
index edf430f..3e13450 100644
--- a/flink-python/pyflink/table/tests/test_calc.py
+++ b/flink-python/pyflink/table/tests/test_calc.py
@@ -63,20 +63,14 @@ class StreamTableCalcTests(PyFlinkStreamTableTestCase):
 
 def test_from_element(self):
 t_env = self.t_env
-a = array.array('b')
-a.fromstring('ABCD')
-t = t_env.from_elements(
-[(1, 1.0, "hi", "hello", datetime.date(1970, 1, 2), 
datetime.time(1, 0, 0),
- datetime.datetime(1970, 1, 2, 0, 0), [1.0, None], 
array.array("d", [1.0, 2.0]),
- ["abc"], [datetime.date(1970, 1, 2)], Decimal(1), Row("a", 
"b")(1, 2.0),
- {"key": 1.0}, a, ExamplePoint(1.0, 2.0),
- PythonOnlyPoint(3.0, 4.0))])
 field_names = ["a", "b", "c", "d", "e", "f", "g", "h",
-   "i", "j", "k", "l", "m", "n", "o", "p", "q"]
+   "i", "j", "k", "l", "m", "n", "o", "p", "q", "r", "s"]
 field_types = [DataTypes.BIGINT(), DataTypes.DOUBLE(), 
DataTypes.STRING(),
DataTypes.STRING(), DataTypes.DATE(),
DataTypes.TIME(),
DataTypes.TIMESTAMP(),
+   DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(),
+   DataTypes.INTERVAL(DataTypes.DAY(), DataTypes.SECOND()),
DataTypes.ARRAY(DataTypes.DOUBLE()),
DataTypes.ARRAY(DataTypes.DOUBLE(False)),
DataTypes.ARRAY(DataTypes.STRING()),
@@ -87,16 +81,28 @@ class StreamTableCalcTests(PyFlinkStreamTableTestCase):
DataTypes.MAP(DataTypes.STRING(), DataTypes.DOUBLE()),
DataTypes.BYTES(), ExamplePointUDT(),
PythonOnlyUDT()]
+schema = DataTypes.ROW(
+list(map(lambda field_name, field_type: 
DataTypes.FIELD(field_name, field_type),
+ field_names,
+ field_types)))
 table_sink = source_sink_utils.TestAppendSink(field_names, field_types)
 t_env.register_table_sink("Results", table_sink)
-
+t = t_env.from_elements(
+[(1, 1.0, "hi", "hello", datetime.date(1970, 1, 2), 
datetime.time(1, 0, 0),
+  datetime.datetime(1970, 1, 2, 0, 0), datetime.datetime(1970, 1, 
2, 0, 0),
+  datetime.timedelta(days=1, microseconds=10),
+  [1.0, None], array.array("d", [1.0, 2.0]),
+  ["abc"], [datetime.date(1970, 1, 2)], Decimal(1), Row("a"

[flink] branch master updated: [FLINK-12990][python] Fix LocalTimeZone issue for Date type.

2019-06-26 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 731c38b  [FLINK-12990][python] Fix LocalTimeZone issue for Date type.
731c38b is described below

commit 731c38baf1e72f963db1c067ada20dc23e9fc23c
Author: Dian Fu 
AuthorDate: Mon Jun 24 15:10:50 2019 +0800

[FLINK-12990][python] Fix LocalTimeZone issue for Date type.

This closes #8892
---
 .../flink/table/util/python/PythonTableUtils.scala | 29 ++-
 .../table/util/python/PythonTableUtilsTest.scala   | 56 ++
 2 files changed, 84 insertions(+), 1 deletion(-)

diff --git 
a/flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/util/python/PythonTableUtils.scala
 
b/flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/util/python/PythonTableUtils.scala
index e56b3d2..73524f0 100644
--- 
a/flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/util/python/PythonTableUtils.scala
+++ 
b/flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/util/python/PythonTableUtils.scala
@@ -20,6 +20,8 @@ package org.apache.flink.table.util.python
 
 import java.nio.charset.StandardCharsets
 import java.sql.{Date, Time, Timestamp}
+import java.time.{LocalDate, LocalDateTime, LocalTime}
+import java.util.TimeZone
 import java.util.function.BiConsumer
 
 import org.apache.flink.api.common.functions.MapFunction
@@ -131,7 +133,10 @@ object PythonTableUtils {
 }
 
 case _ if dataType == Types.SQL_DATE => (obj: Any) => nullSafeConvert(obj) 
{
-  case c: Int => new Date(c * 86400000)
+  case c: Int =>
+val millisLocal = c.toLong * 86400000
+val millisUtc = millisLocal - getOffsetFromLocalMillis(millisLocal)
+new Date(millisUtc)
 }
 
 case _ if dataType == Types.SQL_TIME => (obj: Any) => nullSafeConvert(obj) 
{
@@ -389,4 +394,26 @@ object PythonTableUtils {
 array
 }
   }
+
+  def getOffsetFromLocalMillis(millisLocal: Long): Int = {
+val localZone = TimeZone.getDefault
+var result = localZone.getRawOffset
+// the actual offset should be calculated based on milliseconds in UTC
+val offset = localZone.getOffset(millisLocal - result)
+if (offset != result) {
+  // DayLight Saving Time
+  result = localZone.getOffset(millisLocal - offset)
+  if (result != offset) {
+// fallback to do the reverse lookup using java.time.LocalDateTime
+// this should only happen near the start or end of DST
+val localDate = LocalDate.ofEpochDay(millisLocal / 86400000)
+val localTime = LocalTime.ofNanoOfDay(
+  Math.floorMod(millisLocal, 86400000) * 1000 * 1000)
+val localDateTime = LocalDateTime.of(localDate, localTime)
+val millisEpoch = 
localDateTime.atZone(localZone.toZoneId).toInstant.toEpochMilli
+result = (millisLocal - millisEpoch).toInt
+  }
+}
+result
+  }
 }
diff --git 
a/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/util/python/PythonTableUtilsTest.scala
 
b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/util/python/PythonTableUtilsTest.scala
new file mode 100644
index 000..822e748
--- /dev/null
+++ 
b/flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/util/python/PythonTableUtilsTest.scala
@@ -0,0 +1,56 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.util.python
+
+import java.time.ZoneId
+import java.util.TimeZone
+
+import org.apache.calcite.avatica.util.DateTimeUtils
+import org.junit.Test
+import org.junit.Assert.assertEquals
+
+class PythonTableUtilsTest {
+
+  @Test
+  def testGetOffsetFromLocalMillis(): Unit = {
+def testOffset(localMillis: Long, expected: Long): Unit = {
+  assertEquals(expected, 
PythonTableUtils.getOffsetFromLocalMillis(localMillis))
+}
+
+val originalZone = TimeZone.getDefault
+try {
+  // Daylight Saving Time Test
+  TimeZone.setDefault(TimeZone.getTimeZ
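
For readers who want the gist of the fix in Python terms, here is an
illustrative sketch (not the Flink code above, and without the DST-gap
fallback that getOffsetFromLocalMillis adds): a date arrives as days since
the epoch, java.sql.Date is rendered in the JVM's default zone, so the
local-midnight milliseconds must be shifted back by the zone offset:

    import datetime

    MILLIS_PER_DAY = 86400000

    def days_to_utc_millis(days_since_epoch: int) -> int:
        # Naive datetime for local midnight of the given epoch day.
        local_midnight = (datetime.datetime(1970, 1, 1)
                          + datetime.timedelta(days=days_since_epoch))
        # astimezone() on a naive datetime assumes the system time zone,
        # so the offset is DST-aware for that date.
        offset = local_midnight.astimezone().utcoffset()
        millis_local = days_since_epoch * MILLIS_PER_DAY
        return millis_local - int(offset.total_seconds() * 1000)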

[flink] branch master updated: [hotfix][python] Align the signature of type utility methods with Java

2019-06-26 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new ef3e201  [hotfix][python] Align the signature of type utility methods 
with Java
ef3e201 is described below

commit ef3e201e180aa943ee25c645622aa4b5d7bdaed8
Author: Dian Fu 
AuthorDate: Wed Jun 19 20:36:05 2019 +0800

[hotfix][python] Align the signature of type utility methods with Java

This closes #8893
---
 flink-python/pyflink/table/tests/test_calc.py  |   6 +-
 flink-python/pyflink/table/tests/test_types.py | 160 +-
 flink-python/pyflink/table/types.py| 391 ++---
 3 files changed, 359 insertions(+), 198 deletions(-)

diff --git a/flink-python/pyflink/table/tests/test_calc.py 
b/flink-python/pyflink/table/tests/test_calc.py
index 147b3c2..edf430f 100644
--- a/flink-python/pyflink/table/tests/test_calc.py
+++ b/flink-python/pyflink/table/tests/test_calc.py
@@ -81,11 +81,11 @@ class StreamTableCalcTests(PyFlinkStreamTableTestCase):
DataTypes.ARRAY(DataTypes.DOUBLE(False)),
DataTypes.ARRAY(DataTypes.STRING()),
DataTypes.ARRAY(DataTypes.DATE()),
-   DataTypes.DECIMAL(),
+   DataTypes.DECIMAL(10, 0),
DataTypes.ROW([DataTypes.FIELD("a", DataTypes.BIGINT()),
   DataTypes.FIELD("b", 
DataTypes.DOUBLE())]),
-   DataTypes.MAP(DataTypes.VARCHAR(), DataTypes.DOUBLE()),
-   DataTypes.VARBINARY(), ExamplePointUDT(),
+   DataTypes.MAP(DataTypes.STRING(), DataTypes.DOUBLE()),
+   DataTypes.BYTES(), ExamplePointUDT(),
PythonOnlyUDT()]
 table_sink = source_sink_utils.TestAppendSink(field_names, field_types)
 t_env.register_table_sink("Results", table_sink)
diff --git a/flink-python/pyflink/table/tests/test_types.py 
b/flink-python/pyflink/table/tests/test_types.py
index ed4f19f..4888583 100644
--- a/flink-python/pyflink/table/tests/test_types.py
+++ b/flink-python/pyflink/table/tests/test_types.py
@@ -30,7 +30,7 @@ from pyflink.table.types import (_infer_schema_from_data, 
_infer_type,
  _array_type_mappings, _merge_type,
  _create_type_verifier, UserDefinedType, 
DataTypes, Row, RowField,
  RowType, ArrayType, BigIntType, VarCharType, 
MapType, DataType,
- _to_java_type, _from_java_type, TimestampKind)
+ _to_java_type, _from_java_type)
 
 
 class ExamplePointUDT(UserDefinedType):
@@ -145,7 +145,7 @@ class TypesTests(unittest.TestCase):
 'VarCharType(2147483647, true)',
 'DateType(true)',
 'TimeType(0, true)',
-'TimestampType(0, 6, true)',
+'TimestampType(6, true)',
 'DoubleType(true)',
 "ArrayType(DoubleType(false), true)",
 "ArrayType(BigIntType(true), true)",
@@ -242,46 +242,46 @@ class TypesTests(unittest.TestCase):
 self.assertEqual(expected_schema, _infer_type(p))
 
 def test_struct_type(self):
-row1 = DataTypes.ROW().add("f1", DataTypes.VARCHAR(nullable=True)) \
-.add("f2", DataTypes.VARCHAR(nullable=True))
-row2 = DataTypes.ROW([DataTypes.FIELD("f1", 
DataTypes.VARCHAR(nullable=True)),
-  DataTypes.FIELD("f2", 
DataTypes.VARCHAR(nullable=True), None)])
+row1 = DataTypes.ROW().add("f1", DataTypes.STRING(nullable=True)) \
+.add("f2", DataTypes.STRING(nullable=True))
+row2 = DataTypes.ROW([DataTypes.FIELD("f1", 
DataTypes.STRING(nullable=True)),
+  DataTypes.FIELD("f2", 
DataTypes.STRING(nullable=True), None)])
 self.assertEqual(row1.field_names(), row2.names)
 self.assertEqual(row1, row2)
 
-row1 = DataTypes.ROW().add("f1", DataTypes.VARCHAR(nullable=True)) \
-.add("f2", DataTypes.VARCHAR(nullable=True))
-row2 = DataTypes.ROW([DataTypes.FIELD("f1", 
DataTypes.VARCHAR(nullable=True))])
+row1 = DataTypes.ROW().add("f1", DataTypes.STRING(nullable=True)) \
+.add("f2", DataTypes.STRING(nullable=True))
+row2 = DataTypes.ROW([DataTypes.FIELD("f1", 
DataTypes.STRING(nullable=True))])
 self.assertNotEqual(row1.field_names(), row2.names)
 self.assertNotEqual(row1, row2)
 
-row1 = (DataTypes.ROW().add(DataTypes.FIELD("f1", 
DataTypes.VARCHAR(nullable=True)))
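
A quick before/after sketch of the renamed factory methods, assuming only
pyflink.table.types:

    from pyflink.table.types import DataTypes

    # Formerly DataTypes.VARCHAR(); now matches the Java DataTypes API.
    string_t = DataTypes.STRING()
    # Formerly DataTypes.VARBINARY().
    bytes_t = DataTypes.BYTES()
    # Precision and scale are now explicit, as in Java.
    decimal_t = DataTypes.DECIMAL(10, 0)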
- 

[flink] branch master updated: [hotfix][python][docs] Fix python doc nav bar not showing and layout issue.

2019-06-25 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new a65c7a1  [hotfix][python][docs] Fix python doc nav bar not showing and 
layout issue.
a65c7a1 is described below

commit a65c7a1730c6e42e13c0d4c22df10758c6c03ee8
Author: Wei Zhong 
AuthorDate: Fri Jun 21 19:50:38 2019 +0800

[hotfix][python][docs] Fix python doc nav bar not showing and layout issue.

This closes #8825
---
 flink-python/docs/_static/pyflink.css| 28 
 flink-python/docs/_templates/layout.html |  6 --
 2 files changed, 32 insertions(+), 2 deletions(-)

diff --git a/flink-python/docs/_static/pyflink.css 
b/flink-python/docs/_static/pyflink.css
new file mode 100644
index 000..112423c
--- /dev/null
+++ b/flink-python/docs/_static/pyflink.css
@@ -0,0 +1,28 @@
+/*
+ Licensed to the Apache Software Foundation (ASF) under one or more
+ contributor license agreements.  See the NOTICE file distributed with
+ this work for additional information regarding copyright ownership.
+ The ASF licenses this file to You under the Apache License, Version 2.0
+ (the "License"); you may not use this file except in compliance with
+ the License.  You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+*/
+
+div div.sphinxsidebar {
+width: 274px;
+}
+
+div div.bodywrapper {
+margin: 0 0 0 274px;
+}
+
+div div.body {
+max-width: none;
+}
diff --git a/flink-python/docs/_templates/layout.html 
b/flink-python/docs/_templates/layout.html
index 68848df..796d061 100644
--- a/flink-python/docs/_templates/layout.html
+++ b/flink-python/docs/_templates/layout.html
@@ -17,6 +17,8 @@ specific language governing permissions and limitations
 under the License.
 -->
 {% extends "!layout.html" %}
-{% set script_files = script_files + ["_static/pyflink.js"] %}
-{% block rootrellink %}
+{% block linktags %}
+{{ super() }}
+<link rel="stylesheet" href="{{ pathto('_static/pyflink.css', 1) }}" type="text/css" />
+<script type="text/javascript" src="{{ pathto('_static/pyflink.js', 1) }}"></script>
 {% endblock %}
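
As an aside, newer Sphinx releases (1.8+) can include the same assets from
conf.py; a hypothetical alternative, not what this commit does:

    # conf.py (hypothetical alternative to overriding layout.html)
    html_static_path = ['_static']
    html_css_files = ['pyflink.css']
    html_js_files = ['pyflink.js']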



svn commit: r34622 - /dev/flink/flink-1.8.1-rc1/

2019-06-25 Thread jincheng
Author: jincheng
Date: Tue Jun 25 08:27:18 2019
New Revision: 34622

Log:
flink-1.8.1 release

Added:
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz   (with props)
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.asc
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.sha512
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.asc
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.sha512
Modified:
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.asc
==
--- dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.asc (added)
+++ dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.asc Tue Jun 25 
08:27:18 2019
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl0R1vUACgkQMhGwcDt5
+6g4/2A/+Phn6gyCyM8fkbkQzqdXIGkyPcyUdTAWcixs3+HEX2PrMX0BptYKs1ZJy
+SqpkhY8mevQvFGDZ/x9HP80k+aLkykTK1JR0WyS1Pz9NrQqapLZQrK3BCIk2vbr0
+4kV3tv2943+iYkslnFoyfLEleZt8q7aOYUXX8mNfqoIZ7az0Lh/zMU+09i7evQKz
+/UhtpO0LO97XgcTH/d4t8Pucz8GyGLUsnsnJt5kxucCx6s38q+74qkp9ZaPkQFxF
+RR9Vn8ztKz3Pik8iUTWY6g60Ba+TbC5PqGkF6ayPpl4HasV2NkQjJcDWjugL2b1S
+lL7CTkWfZnMqvGyJ576RpEameu9mGBj/1vEbT0GPNCUzqJGZ4tkvpJ7uFgGmttiy
++1U0o3ZLZCzxSqcOLrG9eukOcXyL+T8pbs5RhJ+Qc0nWtaM1fiFye5vtsPyAE1uZ
++S7QaZZ8Zax+KKUok0r5ED/m/Bgh7cvZzky/eFTHIjG/cy4mvdBv8gH/IKHPj1kY
+daDhaBPK2g3uGBVT+W7HeEH1qZm5o5258eLBvAnNMQyPs1/G9lHDCys8TY5bER4l
+nU1nYgIY3SBUxW2LDtKlosDgGjy6u/5OZTrFXIuHfav3z5E1wn/4vChSLS1LjfDH
+oqjb2JFComAiFGx2NR7Rqe7ARaJEdlYE4KMqiAiZkjOyhJr3nuo=
+=UOE9
+-END PGP SIGNATURE-

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.sha512
==
--- dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.sha512 (added)
+++ dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.11.tgz.sha512 Tue Jun 25 
08:27:18 2019
@@ -0,0 +1 @@
+bebdf73475c305699ee29f295535cfd2358251fd159f30da9895eed9f0736a4eee8dac6094368fbdcc5c4d296c49614c46c97d1fba44aa4dca318a9005d0
  flink-1.8.1-bin-scala_2.11.tgz

Modified: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz
==
Binary files - no diff available.

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.asc
==
--- dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.asc (added)
+++ dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.asc Tue Jun 25 
08:27:18 2019
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl0R1KwACgkQMhGwcDt5
+6g6cAg//Tl6CKQ5HPcpTLHlXvcQik3pYwPNZOGj1mzmTizVaMi9qHzLUAoVBgNlj
+imEhUOcqLSEmkmJ4gbgwy2wbhGgnfKDdxvDJR/5KEr3eArU9N7YrGT5Rl3Ai5vPJ
+RPsSI94P3rOZtrMPYOgWOWV3fORKIA8OmmeOkUti/QSAsszBXq/KBnqjYNoC8yig
+jIr0RMXcdXksaHf1np8c2qSxl25irJAAeOgmQ1SHIuPyKiagV39vxUPz3RR6XJbE
+pCXibAp1H/nsrApdwA+ZbWKvUUO38BSQfbDaqYrTMMl2H0fuE8VwvX6UPrc0CPU7
+0219cOVvRX9G7L8gNGNBb3lW0Y90UcnxEq/Os1iGug13l7KQUHY32yLh67VKWSgU
+m44/bhF/IvtKgMbth0u/o1ayIcruPWeOLuNUxe/f6rcix9/XSeHwCs3MRjdgNCq1
+v4h6A4sNpmDIGp2IV9u8TRO33sHS++sY6KsXmhBGGYU9q2ZawbPrdmPBFv2FS6gr
+B/kxiYnw6+/AB4TEGYcC3J5wYOzOzulmqaHX+JLzluafpjADaj6hJ4B/HnHcUyJC
+wFaXtzXnGlaVrnzJXIQ+Jglr6GWeJ9bIOKwf/H4udOe7oGUPA85rz+r77m+sc/5u
+d25r8NN9eZbgGItYZGFgqiI7tnVogvViU9yxEJmUVgTBzM64iig=
+=CwHu
+-END PGP SIGNATURE-

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.sha512
==
--- dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.sha512 (added)
+++ dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz.sha512 Tue Jun 25 
08:27:18 2019
@@ -0,0 +1 @@
+c5330f2b7dc61fbc58870b74f089c33d34409dc6cabef349b8647f5e8b8829a90954d228c57ea655cd1afa80be36a60cadc1a675a4655ae48f158adccf599a83
  flink-1.8.1-bin-scala_2.12.tgz




svn commit: r34621 - in /dev/flink/flink-1.8.1-rc1: ./ flink-1.8.1-bin-scala_2.12.tgz flink-1.8.1-src.tgz flink-1.8.1-src.tgz.asc flink-1.8.1-src.tgz.sha512

2019-06-25 Thread jincheng
Author: jincheng
Date: Tue Jun 25 06:35:23 2019
New Revision: 34621

Log:
flink-1.8.1 release

Added:
dev/flink/flink-1.8.1-rc1/
dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz   (with props)
dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz   (with props)
dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.asc
dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.sha512

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.8.1-rc1/flink-1.8.1-bin-scala_2.12.tgz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz
==
Binary file - no diff available.

Propchange: dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz
--
svn:mime-type = application/octet-stream

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.asc
==
--- dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.asc (added)
+++ dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.asc Tue Jun 25 06:35:23 2019
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl0Q5qQACgkQMhGwcDt5
+6g7OnA/+McKBdRdOQZNjNup38z4PX2EiVVq5IfnzzjSeQTYVIpYsXA/w142sRyxB
+xaQC3iET+QqgHzF/Rb5eTy2zG/IuPyRIsl0+TMZIDF1VF9PguW62WeL3J/gjv+E5
+PjvnBRc1obBO6XOUSuhvjRk8ve0hE+pFl6rCWkifPLC9gErdYFQBNyjvb5JeUCGl
++rOVeG2WdiH01qqARknzOSPyA30lvIb7fzgrnEXs78HfI2PlniGXJ8xVpvBqGjnZ
+nWvb0R4iCJkf30P/tdL+5Luh+xv4LM+E6zJslytNw8eVa7pMJ9wnxp8kbq6jZGU+
+FX4QuHufB09PrmZ6KxJpjzU6l+VGznO5U3FOOzdMYK5/mNe1x8ZVXbEmc4sL4gqK
+CQgPKefrq/YvL5v/AOfnu6VS6z4zZO5O69fupXi9z7g4uGG3DEy2Hcl2/MnaI8Lc
+VexJXPkouLToet9lyXiB2odOMDsjzJftHsUSF5j+7WBg+yZoXpWYZFJ7i6gksUwo
+UfGP7Rh+NHEdGDHn4wX3bVXdDCg/bnTHsa9CLh7OjGqnA//XXF1foiGrVl7xjObf
+4oCs0gHluRElgJfJjW24OP28AogcO4YZnXIdbcOOZYA5FB1eAPxtv4gj2QOWxVwn
+uemFyNfFpDgLPdAV9CNh+6HofgcneZJjEA+thDSZgJbh26wYTTk=
+=xXPa
+-END PGP SIGNATURE-

Added: dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.sha512
==
--- dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.sha512 (added)
+++ dev/flink/flink-1.8.1-rc1/flink-1.8.1-src.tgz.sha512 Tue Jun 25 06:35:23 
2019
@@ -0,0 +1 @@
+dcd57a0d04d7dc36d87fb5d98a8ac982564db9a683f86eb4be77325e74c6972fe134a64d5c3e711e4062079820937b01e45e02315e00294041652cc92be2c109
  flink-1.8.1-src.tgz
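
For completeness, a small sketch, independent of the release tooling, showing
how a downloaded artifact can be checked against one of the .sha512 files
staged above:

    import hashlib

    def verify_sha512(artifact_path: str, sha512_file: str) -> bool:
        # The .sha512 files above contain '<hex digest>  <file name>'.
        expected = open(sha512_file).read().split()[0]
        digest = hashlib.sha512()
        with open(artifact_path, 'rb') as f:
            for chunk in iter(lambda: f.read(1 << 20), b''):
                digest.update(chunk)
        return digest.hexdigest() == expected

    # e.g. verify_sha512('flink-1.8.1-src.tgz', 'flink-1.8.1-src.tgz.sha512')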




[flink] branch master updated: [FLINK-12920][python] Drop support of register_table_sink with parameters field_names and field_types

2019-06-24 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 5d8054b  [FLINK-12920][python] Drop support of register_table_sink 
with parameters field_names and field_types
5d8054b is described below

commit 5d8054be418cafaff9753e85724aa8bff21574da
Author: Dian Fu 
AuthorDate: Fri Jun 21 10:05:23 2019 +0800

[FLINK-12920][python] Drop support of register_table_sink with parameters 
field_names and field_types

This closes #8817
---
 .../dataset/tests/test_execution_environment.py|  2 +-
 .../tests/test_stream_execution_environment.py |  2 +-
 .../pyflink/table/examples/batch/word_count.py |  3 ++-
 flink-python/pyflink/table/sinks.py| 13 ++--
 flink-python/pyflink/table/table_environment.py| 13 +++-
 flink-python/pyflink/table/tests/test_calc.py  |  5 ++---
 .../table/tests/test_table_environment_api.py  | 12 +--
 flink-python/pyflink/testing/source_sink_utils.py  | 24 +++---
 8 files changed, 43 insertions(+), 31 deletions(-)

diff --git a/flink-python/pyflink/dataset/tests/test_execution_environment.py 
b/flink-python/pyflink/dataset/tests/test_execution_environment.py
index 8f41636..7adac00 100644
--- a/flink-python/pyflink/dataset/tests/test_execution_environment.py
+++ b/flink-python/pyflink/dataset/tests/test_execution_environment.py
@@ -107,7 +107,7 @@ class ExecutionEnvironmentTests(PyFlinkTestCase):
 t_env.register_table_source("Orders", csv_source)
 t_env.register_table_sink(
 "Results",
-field_names, field_types, CsvTableSink(tmp_csv))
+CsvTableSink(field_names, field_types, tmp_csv))
 t_env.scan("Orders").insert_into("Results")
 
 plan = t_env.exec_env().get_execution_plan()
diff --git 
a/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py 
b/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py
index d37ecae..43768fa 100644
--- a/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py
+++ b/flink-python/pyflink/datastream/tests/test_stream_execution_environment.py
@@ -184,7 +184,7 @@ class StreamExecutionEnvironmentTests(PyFlinkTestCase):
 t_env.register_table_source("Orders", csv_source)
 t_env.register_table_sink(
 "Results",
-field_names, field_types, CsvTableSink(tmp_csv))
+CsvTableSink(field_names, field_types, tmp_csv))
 t_env.scan("Orders").insert_into("Results")
 
 plan = t_env.exec_env().get_execution_plan()
diff --git a/flink-python/pyflink/table/examples/batch/word_count.py 
b/flink-python/pyflink/table/examples/batch/word_count.py
index 4a7c7f2..721e002 100644
--- a/flink-python/pyflink/table/examples/batch/word_count.py
+++ b/flink-python/pyflink/table/examples/batch/word_count.py
@@ -21,7 +21,8 @@ import shutil
 import sys
 import tempfile
 
-from pyflink.table import TableEnvironment, TableConfig, FileSystem, OldCsv, 
Schema
+from pyflink.table import TableConfig, TableEnvironment
+from pyflink.table.descriptors import FileSystem, OldCsv, Schema
 from pyflink.table.types import DataTypes
 
 
diff --git a/flink-python/pyflink/table/sinks.py 
b/flink-python/pyflink/table/sinks.py
index 14237d9..9722602 100644
--- a/flink-python/pyflink/table/sinks.py
+++ b/flink-python/pyflink/table/sinks.py
@@ -17,6 +17,8 @@
 

 
 from pyflink.java_gateway import get_gateway
+from pyflink.table.types import _to_java_type, DataType
+from pyflink.util import utils
 
 __all__ = ['TableSink', 'CsvTableSink']
 
@@ -39,14 +41,17 @@ class CsvTableSink(TableSink):
 """
 A simple :class:`TableSink` to emit data as CSV files.
 
+:param field_names: The list of field names.
+:param field_types: The list of field data types.
 :param path: The output path to write the Table to.
 :param field_delimiter: The field delimiter.
 :param num_files: The number of files to write to.
 :param write_mode: The write mode to specify whether existing files are 
overwritten or not.
 """
 
-def __init__(self, path, field_delimiter=',', num_files=1, 
write_mode=None):
-# type: (str, str, int, int) -> None
+def __init__(self, field_names, field_types, path, field_delimiter=',', 
num_files=1,
+ write_mode=None):
+# type: (list[str], list[DataType], str, str, int, int) -> None
 gateway = get_gateway()
 if write_mode == WriteMode.NO_OVERWRITE:
 j_write_mode = gateway.jvm.scala.Option.apply(
@@ -62,4 +67,8 @@ class CsvTableSink(TableSink):
  
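A minimal usage sketch after this change (a TableEnvironment t_env and the
output path are assumed); field names and types now go to the sink
constructor rather than to register_table_sink:

    from pyflink.table.sinks import CsvTableSink
    from pyflink.table.types import DataTypes

    sink = CsvTableSink(
        ['word', 'count'],                         # field_names
        [DataTypes.STRING(), DataTypes.BIGINT()],  # field_types
        '/tmp/results.csv')                        # illustrative path
    t_env.register_table_sink('Results', sink)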

[flink] 02/02: [hotfix][python] Use of a variable like $X instead of ${X}

2019-06-24 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 8d98223282e8479964f13d92c5e0a591c1a72c71
Author: sunjincheng121 
AuthorDate: Tue Jun 25 08:39:41 2019 +0800

[hotfix][python] Use of a variable like $X instead of ${X}
---
 flink-python/dev/lint-python.sh | 30 +++---
 1 file changed, 15 insertions(+), 15 deletions(-)

diff --git a/flink-python/dev/lint-python.sh b/flink-python/dev/lint-python.sh
index e681589..66de146 100755
--- a/flink-python/dev/lint-python.sh
+++ b/flink-python/dev/lint-python.sh
@@ -190,7 +190,7 @@ function install_py_env() {
 fi
 fi
 print_function "STEP" "installing python${py_env[i]}..."
-${CONDA_PATH} create --name ${py_env[i]} -y -q python=${py_env[i]} 
2>&1 >/dev/null
+$CONDA_PATH create --name ${py_env[i]} -y -q python=${py_env[i]} 2>&1 
>/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install ${py_env[i]} failed.\
 You can retry to exec the script."
@@ -204,7 +204,7 @@ function install_py_env() {
 # In some situations,you need to run the script with "sudo". e.g. sudo 
./lint-python.sh
 function install_tox() {
 if [ -f "$TOX_PATH" ]; then
-${CONDA_PATH} remove -p ${CONDA_HOME} tox -y -q 2>&1 >/dev/null
+$CONDA_PATH remove -p $CONDA_HOME tox -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda remove tox failed \
 please try to exec the script again.\
@@ -213,7 +213,7 @@ function install_tox() {
 fi
 fi
 
-${CONDA_PATH} install -p ${CONDA_HOME} -c conda-forge tox -y -q 2>&1 
>/dev/null
+$CONDA_PATH install -p $CONDA_HOME -c conda-forge tox -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install tox failed \
 please try to exec the script again.\
@@ -226,7 +226,7 @@ function install_tox() {
 # In some situations,you need to run the script with "sudo". e.g. sudo 
./lint-python.sh
 function install_flake8() {
 if [ -f "$FLAKE8_PATH" ]; then
-${CONDA_PATH} remove -p ${CONDA_HOME} flake8 -y -q 2>&1 >/dev/null
+$CONDA_PATH remove -p $CONDA_HOME flake8 -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda remove flake8 failed \
 please try to exec the script again.\
@@ -235,7 +235,7 @@ function install_flake8() {
 fi
 fi
 
-${CONDA_PATH} install -p ${CONDA_HOME} -c anaconda flake8 -y -q 2>&1 
>/dev/null
+$CONDA_PATH install -p $CONDA_HOME -c anaconda flake8 -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install flake8 failed \
 please try to exec the script again.\
@@ -248,7 +248,7 @@ function install_flake8() {
 # In some situations,you need to run the script with "sudo". e.g. sudo 
./lint-python.sh
 function install_sphinx() {
 if [ -f "$SPHINX_PATH" ]; then
-${CONDA_PATH} remove -p ${CONDA_HOME} sphinx -y -q 2>&1 >/dev/null
+$CONDA_PATH remove -p $CONDA_HOME sphinx -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda remove sphinx failed \
 please try to exec the script again.\
@@ -257,7 +257,7 @@ function install_sphinx() {
 fi
 fi
 
-${CONDA_PATH} install -p ${CONDA_HOME} -c anaconda sphinx -y -q 2>&1 
>/dev/null
+$CONDA_PATH install -p $CONDA_HOME -c anaconda sphinx -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install sphinx failed \
 please try to exec the script again.\
@@ -352,13 +352,13 @@ function create_dir() {
 
 # Set created py-env in $PATH for tox's creating virtual env
 function activate () {
-if [ ! -d ${CURRENT_DIR}/.conda/envs ]; then
-echo "For some unkown reasons,missing the directory 
${CURRENT_DIR}/.conda/envs,\
+if [ ! -d $CURRENT_DIR/.conda/envs ]; then
+echo "For some unkown reasons,missing the directory 
$CURRENT_DIR/.conda/envs,\
 you should exec the script with the option: -f"
 exit 1
 fi
 
-for py_dir in ${CURRENT_DIR}/.conda/envs/*
+for py_dir in $CURRENT_DIR/.conda/envs/*
 do
 PATH=$py_dir/bin:$PATH
 done
@@ -533,16 +533,16 @@ pushd "$FLINK_PYTHON_DIR" &> /dev/null
 CONDA_HOME=$CURRENT_DIR/.conda
 
 # conda path
-CONDA_PATH=$CURRENT_DIR/.conda/bin/conda
+CONDA_PATH=$CONDA_HOME/bin/conda
 
 # tox path
-TOX_PATH=$CURRENT_DIR/.conda/bin/tox
+TOX_PATH=$CONDA_HOME/bin/tox
 
 # flake8 path
-FLAKE8_PATH=$CURRENT_DIR/.conda/bin/flake8
+FLAKE8_PATH=$CONDA_HOME/bin/flake8
 
 # sphinx path
-SPHINX_PATH=$CURRENT_DIR/.conda/bin/sphinx-build
+SPHINX_PATH=$CONDA_HOME/bin/sphinx-build
 
 _OLD_PATH="$PATH"
 
@@ -552,7 +552,7 @@ SUPPORT_OS=("Darwin" "Linux")
 STAGE_FILE=$CURRENT_DIR/.stage.txt
 
 # the dir includes all kinds of py env installed.
-VIRTUAL_ENV=$CURRENT_DIR/.conda/envs
+VIRTUAL_ENV=$CONDA_HOME/envs
 
 LOG_DIR=$CURRENT_DIR/log
 



[flink] 01/02: [FLINK-12931][python] Fix lint-python.sh cannot find flake8

2019-06-24 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git

commit ea125a3d14a07935cd5766fcbe411254595c386b
Author: Dian Fu 
AuthorDate: Sun Jun 23 12:26:42 2019 +0800

[FLINK-12931][python] Fix lint-python.sh cannot find flake8
---
 flink-python/dev/lint-python.sh | 15 +--
 1 file changed, 9 insertions(+), 6 deletions(-)

diff --git a/flink-python/dev/lint-python.sh b/flink-python/dev/lint-python.sh
index c3a8339..e681589 100755
--- a/flink-python/dev/lint-python.sh
+++ b/flink-python/dev/lint-python.sh
@@ -204,7 +204,7 @@ function install_py_env() {
 # In some situations,you need to run the script with "sudo". e.g. sudo 
./lint-python.sh
 function install_tox() {
 if [ -f "$TOX_PATH" ]; then
-${CONDA_PATH} remove tox -y -q 2>&1 >/dev/null
+${CONDA_PATH} remove -p ${CONDA_HOME} tox -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda remove tox failed \
 please try to exec the script again.\
@@ -213,7 +213,7 @@ function install_tox() {
 fi
 fi
 
-${CONDA_PATH} install -c conda-forge tox -y -q 2>&1 >/dev/null
+${CONDA_PATH} install -p ${CONDA_HOME} -c conda-forge tox -y -q 2>&1 
>/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install tox failed \
 please try to exec the script again.\
@@ -226,7 +226,7 @@ function install_tox() {
 # In some situations,you need to run the script with "sudo". e.g. sudo 
./lint-python.sh
 function install_flake8() {
 if [ -f "$FLAKE8_PATH" ]; then
-${CONDA_PATH} remove flake8 -y -q 2>&1 >/dev/null
+${CONDA_PATH} remove -p ${CONDA_HOME} flake8 -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda remove flake8 failed \
 please try to exec the script again.\
@@ -235,7 +235,7 @@ function install_flake8() {
 fi
 fi
 
-${CONDA_PATH} install -c anaconda flake8 -y -q 2>&1 >/dev/null
+${CONDA_PATH} install -p ${CONDA_HOME} -c anaconda flake8 -y -q 2>&1 
>/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install flake8 failed \
 please try to exec the script again.\
@@ -248,7 +248,7 @@ function install_flake8() {
 # In some situations,you need to run the script with "sudo". e.g. sudo 
./lint-python.sh
 function install_sphinx() {
 if [ -f "$SPHINX_PATH" ]; then
-${CONDA_PATH} remove sphinx -y -q 2>&1 >/dev/null
+${CONDA_PATH} remove -p ${CONDA_HOME} sphinx -y -q 2>&1 >/dev/null
 if [ $? -ne 0 ]; then
 echo "conda remove sphinx failed \
 please try to exec the script again.\
@@ -257,7 +257,7 @@ function install_sphinx() {
 fi
 fi
 
-${CONDA_PATH} install -c anaconda sphinx -y -q 2>&1 >/dev/null
+${CONDA_PATH} install -p ${CONDA_HOME} -c anaconda sphinx -y -q 2>&1 
>/dev/null
 if [ $? -ne 0 ]; then
 echo "conda install sphinx failed \
 please try to exec the script again.\
@@ -529,6 +529,9 @@ CURRENT_DIR="$(cd "$( dirname "$0" )" && pwd)"
 FLINK_PYTHON_DIR=$(dirname "$CURRENT_DIR")
 pushd "$FLINK_PYTHON_DIR" &> /dev/null
 
+# conda home path
+CONDA_HOME=$CURRENT_DIR/.conda
+
 # conda path
 CONDA_PATH=$CURRENT_DIR/.conda/bin/conda
 



[flink] branch master updated (2949166 -> 8d98223)

2019-06-24 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git.


from 2949166  [FLINK-12663] Implement HiveTableSource to read Hive tables
 new ea125a3  [FLINK-12931][python] Fix lint-python.sh cannot find flake8
 new 8d98223  [hotfix][python] Use of a variable like $X instead of ${X}

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 flink-python/dev/lint-python.sh | 33 ++---
 1 file changed, 18 insertions(+), 15 deletions(-)



[flink] annotated tag release-1.8.1-rc1 created (now 29976ef)

2019-06-23 Thread jincheng
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to annotated tag release-1.8.1-rc1
in repository https://gitbox.apache.org/repos/asf/flink.git.


  at 29976ef  (tag)
 tagging 11ab983ed20068dac93efe7f234ffab9abc2926e (commit)
 replaces pre-apache-rename
  by sunjincheng121
  on Mon Jun 24 08:58:15 2019 +0800

- Log -
release-1.8.1-rc1
-BEGIN PGP SIGNATURE-

iQIzBAABCAAdFiEEj+oe6dAEjAzMcLdXMhGwcDt56g4FAl0QICcACgkQMhGwcDt5
6g6G3Q/+N3LXg1hWTr+FMEnTdrbmvb2m8/l1P+PNf/P5PbV7PrlqE2qLMijxlTI3
DIZL2G3+PRsf68YeZUiGzA5qMI2aygc1cZW5TrBD1927ioZBi2X+bHkOBF18wjB+
l0IqoagRKh8r5T7IWqN/TWuTX1z4EPTb/bxKiPu2Is4MDuqMlF4tiZpEhdN9a4vz
iT1gyBgY3UG4qX98uEb8ClARSjfTrFDn5us/t0mpwFoTh5hItzHZuJPjf2cNRgji
QMRL9qm+Y4NCB74onfdjGfns+lvFRHEUwtaVkVMJKlPNHhFmPCJKVXQG115VgUUZ
tlrVZw4ipj1rCd9kNB2ve6Pzyaulh2BOhIG7I5CzbaZKN8PbUHlgKY5KEi9CjONE
m3jKQWWflEN80cStZkxq4QHBqsdRJRnDRkv272hv2P7r1XjgZIzvmPno8+dGikDv
49M2tNO0ColQ3uCVPkebaOJEHEVnXqoodVCabGnRQE6VfS0I2rXclZ55ymZqyNTI
w2Y1/+j2QEVcU+b6afNUOg9bDWxmYEWEs2THrStkGIP+tXFILt21nkxfmjNp9Xvi
xJV6gQl6Ox8fskwbLRZmfR0JfYyxEHa4XtPJ7egrII0MugDqpbzr0X1ZCcxoU2bx
eHp2aymvsYKporP4rD4jdoYjCYmCd1f/8NjDroAC5wBUsAp0klk=
=bOLX
-END PGP SIGNATURE-
---

This annotated tag includes the following new commits:

 new 11ab983  Commit for release 1.8.1

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



