[flink] branch master updated (8a27bd9 -> 75ad29c)
This is an automated email from the ASF dual-hosted git repository.

bli pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git.

    from 8a27bd9  [FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster
     new 74b8bde  [FLINK-16471][jdbc] develop JDBCCatalog and PostgresCatalog
     new 75ad29c  [FLINK-16472] support precision of timestamp and time data types

The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.

Summary of changes:
 .../src/test/resources/log4j2-test.properties      |  28 --
 flink-connectors/flink-jdbc/pom.xml                |  36 ++-
 .../java/io/jdbc/catalog/AbstractJDBCCatalog.java  | 277 ++
 .../api/java/io/jdbc/catalog/JDBCCatalog.java      |  84 ++
 .../api/java/io/jdbc/catalog/JDBCCatalogUtils.java |  54
 .../api/java/io/jdbc/catalog/PostgresCatalog.java  | 322
 .../java/io/jdbc/catalog/PostgresTablePath.java    |  95 ++
 .../api/java/io/jdbc/dialect/JDBCDialects.java     |  10 +-
 .../java/io/jdbc/catalog/JDBCCatalogUtilsTest.java |  25 +-
 .../io/jdbc/catalog/PostgresCatalogITCase.java     | 325 +
 .../io/jdbc/catalog/PostgresTablePathTest.java     |  12 +-
 11 files changed, 1222 insertions(+), 46 deletions(-)
 delete mode 100644 flink-connectors/flink-connector-elasticsearch2/src/test/resources/log4j2-test.properties
 create mode 100644 flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/AbstractJDBCCatalog.java
 create mode 100644 flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/JDBCCatalog.java
 create mode 100644 flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/JDBCCatalogUtils.java
 create mode 100644 flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalog.java
 create mode 100644 flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresTablePath.java
 copy flink-runtime/src/main/java/org/apache/flink/runtime/io/network/partition/PartitionNotFoundException.java => flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/JDBCCatalogUtilsTest.java (60%)
 create mode 100644 flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalogITCase.java
 copy flink-table/flink-table-common/src/test/java/org/apache/flink/table/module/CoreModuleTest.java => flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresTablePathTest.java (72%)
[flink] 01/02: [FLINK-16471][jdbc] develop JDBCCatalog and PostgresCatalog
This is an automated email from the ASF dual-hosted git repository.

bli pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink.git

commit 74b8bdee9fb0bf7cfc27ca8d992dac2a07473a0c
Author: bowen.li
AuthorDate: Fri Mar 6 14:05:27 2020 -0800

    [FLINK-16471][jdbc] develop JDBCCatalog and PostgresCatalog

    closes #11336
---
 .../src/test/resources/log4j2-test.properties      |  28 --
 flink-connectors/flink-jdbc/pom.xml                |  36 ++-
 .../java/io/jdbc/catalog/AbstractJDBCCatalog.java  | 277 ++
 .../api/java/io/jdbc/catalog/JDBCCatalog.java      |  84 ++
 .../api/java/io/jdbc/catalog/JDBCCatalogUtils.java |  54
 .../api/java/io/jdbc/catalog/PostgresCatalog.java  | 323
 .../java/io/jdbc/catalog/PostgresTablePath.java    |  95 ++
 .../api/java/io/jdbc/dialect/JDBCDialects.java     |  10 +-
 .../java/io/jdbc/catalog/JDBCCatalogUtilsTest.java |  44 +++
 .../io/jdbc/catalog/PostgresCatalogITCase.java     | 325 +
 .../io/jdbc/catalog/PostgresTablePathTest.java     |  33 +++
 11 files changed, 1275 insertions(+), 34 deletions(-)

diff --git a/flink-connectors/flink-connector-elasticsearch2/src/test/resources/log4j2-test.properties b/flink-connectors/flink-connector-elasticsearch2/src/test/resources/log4j2-test.properties
deleted file mode 100644
index 835c2ec..000
--- a/flink-connectors/flink-connector-elasticsearch2/src/test/resources/log4j2-test.properties
+++ /dev/null
@@ -1,28 +0,0 @@
-
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-
-# Set root logger level to OFF to not flood build logs
-# set manually to INFO for debugging purposes
-rootLogger.level = OFF
-rootLogger.appenderRef.test.ref = TestLogger
-
-appender.testlogger.name = TestLogger
-appender.testlogger.type = CONSOLE
-appender.testlogger.target = SYSTEM_ERR
-appender.testlogger.layout.type = PatternLayout
-appender.testlogger.layout.pattern = %-4r [%t] %-5p %c %x - %m%n

diff --git a/flink-connectors/flink-jdbc/pom.xml b/flink-connectors/flink-jdbc/pom.xml
index 3e83311..cb7afab 100644
--- a/flink-connectors/flink-jdbc/pom.xml
+++ b/flink-connectors/flink-jdbc/pom.xml
@@ -35,6 +35,11 @@ under the License.
 	<packaging>jar</packaging>
 
+	<properties>
+		<postgres.version>42.2.10</postgres.version>
+		<otj-pg-embedded.version>0.13.3</otj-pg-embedded.version>
+	</properties>
+
@@ -53,13 +58,17 @@ under the License.
 			<scope>provided</scope>
 		</dependency>
 
+
 		<dependency>
-			<groupId>org.apache.derby</groupId>
-			<artifactId>derby</artifactId>
-			<version>10.14.2.0</version>
-			<scope>test</scope>
+			<groupId>org.postgresql</groupId>
+			<artifactId>postgresql</artifactId>
+			<version>${postgres.version}</version>
+			<scope>provided</scope>
 		</dependency>
+
 		<dependency>
 			<groupId>org.apache.flink</groupId>
 			<artifactId>flink-test-utils_${scala.binary.version}</artifactId>
@@ -89,5 +98,24 @@ under the License.
 			<version>${project.version}</version>
 			<scope>test</scope>
 		</dependency>
+
+		<dependency>
+			<groupId>com.opentable.components</groupId>
+			<artifactId>otj-pg-embedded</artifactId>
+			<version>${otj-pg-embedded.version}</version>
+			<scope>test</scope>
+		</dependency>
+
+		<dependency>
+			<groupId>org.apache.derby</groupId>
+			<artifactId>derby</artifactId>
+			<version>10.14.2.0</version>
+			<scope>test</scope>
+		</dependency>
 	</dependencies>

diff --git a/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/AbstractJDBCCatalog.java b/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/AbstractJDBCCatalog.java
new file mode 100644
index 000..523de83
--- /dev/null
+++ b/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/AbstractJDBCCatalog.java
@@ -0,0 +1,277 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+
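Among the files this commit adds is a `PostgresTablePath` helper: Flink addresses tables as `database.table`, while Postgres inserts a schema layer in between, so table names have to be mapped between `schema_name.table_name` strings and (schema, table) pairs. Purely as an illustration (a Python sketch, not the committed Java class; the `public` default is an assumption based on Postgres's default schema):

```python
# Illustrative sketch only -- NOT the committed Java PostgresTablePath.
# Maps "schema_name.table_name" <-> (schema, table); bare table names
# fall back to the assumed default schema "public".
DEFAULT_SCHEMA = "public"

def from_flink_table_name(flink_table_name):
    """Split 'schema.table' into (schema, table); bare names get the default schema."""
    if "." in flink_table_name:
        schema, table = flink_table_name.split(".", 1)
        return schema, table
    return DEFAULT_SCHEMA, flink_table_name

def to_flink_table_name(schema, table):
    """Re-join a (schema, table) pair into the single-string form."""
    return "%s.%s" % (schema, table)

print(from_flink_table_name("mydb_schema.orders"))  # ('mydb_schema', 'orders')
print(from_flink_table_name("orders"))              # ('public', 'orders')
```

The round trip (`to_flink_table_name(*from_flink_table_name(name))`) is what lets a catalog expose Postgres tables under Flink's two-level naming.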
[flink] 02/02: [FLINK-16472] support precision of timestamp and time data types
This is an automated email from the ASF dual-hosted git repository.

bli pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink.git

commit 75ad29cb9f4f377df27b71e67dbd33f36bb08bee
Author: bowen.li
AuthorDate: Sun Mar 8 20:58:27 2020 -0700

    [FLINK-16472] support precision of timestamp and time data types

    closes #11336
---
 .../api/java/io/jdbc/catalog/PostgresCatalog.java  | 15 +++---
 .../io/jdbc/catalog/PostgresCatalogITCase.java     | 24 +++---
 2 files changed, 19 insertions(+), 20 deletions(-)

diff --git a/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalog.java b/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalog.java
index d12f254..c598073 100644
--- a/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalog.java
+++ b/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalog.java
@@ -55,8 +55,6 @@ public class PostgresCatalog extends AbstractJDBCCatalog {
 
 	private static final Logger LOG = LoggerFactory.getLogger(PostgresCatalog.class);
 
-	public static final String POSTGRES_TABLE_TYPE = "postgres";
-
 	public static final String DEFAULT_DATABASE = "postgres";
 
 	// -- Postgres default objects that shouldn't be exposed to users --
@@ -236,6 +234,7 @@ public class PostgresCatalog extends AbstractJDBCCatalog {
 		String pgType = metadata.getColumnTypeName(colIndex);
 
 		int precision = metadata.getPrecision(colIndex);
+		int scale = metadata.getScale(colIndex);
 
 		switch (pgType) {
 			case PG_BOOLEAN:
@@ -286,17 +285,17 @@ public class PostgresCatalog extends AbstractJDBCCatalog {
 			case PG_TEXT_ARRAY:
 				return DataTypes.ARRAY(DataTypes.STRING());
 			case PG_TIMESTAMP:
-				return DataTypes.TIMESTAMP();
+				return DataTypes.TIMESTAMP(scale);
 			case PG_TIMESTAMP_ARRAY:
-				return DataTypes.ARRAY(DataTypes.TIMESTAMP());
+				return DataTypes.ARRAY(DataTypes.TIMESTAMP(scale));
 			case PG_TIMESTAMPTZ:
-				return DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE();
+				return DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(scale);
 			case PG_TIMESTAMPTZ_ARRAY:
-				return DataTypes.ARRAY(DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE());
+				return DataTypes.ARRAY(DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(scale));
 			case PG_TIME:
-				return DataTypes.TIME();
+				return DataTypes.TIME(scale);
 			case PG_TIME_ARRAY:
-				return DataTypes.ARRAY(DataTypes.TIME());
+				return DataTypes.ARRAY(DataTypes.TIME(scale));
 			case PG_DATE:
 				return DataTypes.DATE();
 			case PG_DATE_ARRAY:

diff --git a/flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalogITCase.java b/flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalogITCase.java
index e103780..b197bf0 100644
--- a/flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalogITCase.java
+++ b/flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/catalog/PostgresCatalogITCase.java
@@ -249,14 +249,14 @@ public class PostgresCatalogITCase {
 			.field("character_arr", DataTypes.ARRAY(DataTypes.CHAR(3)))
 			.field("character_varying", DataTypes.VARCHAR(20))
 			.field("character_varying_arr", DataTypes.ARRAY(DataTypes.VARCHAR(20)))
-			.field("timestamp", DataTypes.TIMESTAMP())
-			.field("timestamp_arr", DataTypes.ARRAY(DataTypes.TIMESTAMP()))
-			.field("timestamptz", DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE())
-			.field("timestamptz_arr", DataTypes.ARRAY(DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE()))
+			.field("timestamp", DataTypes.TIMESTAMP(5))
+			.field("timestamp_arr", DataTypes.ARRAY(DataTypes.TIMESTAMP(5)))
+			.field("timestamptz", DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(4))
+			.field("timestamptz_arr", DataTypes.ARRAY(DataTypes.TIMESTAMP_WITH_LOCAL_TIME_ZONE(4)))
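The change in this commit threads the column's fractional-second digits (the JDBC scale from `ResultSetMetaData.getScale`) into the temporal Flink types instead of the earlier precision-less `TIMESTAMP()`/`TIME()` defaults. A rough Python sketch of that mapping idea (the string type names and the `_` array-name prefix below are illustrative stand-ins, not Flink's `DataTypes` API):

```python
# Rough sketch of the mapping in the diff above: carry the JDBC "scale"
# (fractional-second digits) into the temporal type instead of emitting
# a precision-less default. Not Flink code; types are plain strings here.
def to_flink_type(pg_type, scale):
    temporal = {
        "timestamp": "TIMESTAMP({})",
        "timestamptz": "TIMESTAMP_WITH_LOCAL_TIME_ZONE({})",
        "time": "TIME({})",
    }
    if pg_type in temporal:
        return temporal[pg_type].format(scale)
    if pg_type.startswith("_"):  # Postgres names array types with a "_" prefix
        return "ARRAY<{}>".format(to_flink_type(pg_type[1:], scale))
    return pg_type.upper()  # e.g. date -> DATE

print(to_flink_type("timestamp", 5))     # TIMESTAMP(5)
print(to_flink_type("_timestamptz", 4))  # ARRAY<TIMESTAMP_WITH_LOCAL_TIME_ZONE(4)>
```

This matches what the updated test above expects: a Postgres `timestamp(5)` column surfaces as `TIMESTAMP(5)` rather than a generic `TIMESTAMP`.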
[flink-playgrounds] branch release-1.9 updated: [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml
This is an automated email from the ASF dual-hosted git repository.

fhueske pushed a commit to branch release-1.9 in repository https://gitbox.apache.org/repos/asf/flink-playgrounds.git

The following commit(s) were added to refs/heads/release-1.9 by this push:
     new 95d2fdc  [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml
95d2fdc is described below

commit 95d2fdc0078df96b2ae2b4a40ccde76f83327f8a
Author: Fabian Hueske
AuthorDate: Wed Mar 11 11:26:17 2020 +0100

    [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml

    * Update Flink version to 1.9.2

    This closes #10.
---
 docker/ops-playground-image/Dockerfile    | 2 +-
 operations-playground/docker-compose.yaml | 6 +++---
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docker/ops-playground-image/Dockerfile b/docker/ops-playground-image/Dockerfile
index 59b40a0..d931804 100644
--- a/docker/ops-playground-image/Dockerfile
+++ b/docker/ops-playground-image/Dockerfile
@@ -32,7 +32,7 @@ RUN mvn clean install
 # Build Operations Playground Image
 ###
 
-FROM flink:1.9.0-scala_2.11
+FROM flink:1.9.2-scala_2.11
 
 WORKDIR /opt/flink/bin

diff --git a/operations-playground/docker-compose.yaml b/operations-playground/docker-compose.yaml
index 5a88b98..270bb2d 100644
--- a/operations-playground/docker-compose.yaml
+++ b/operations-playground/docker-compose.yaml
@@ -20,7 +20,7 @@ version: "2.1"
 services:
   client:
     build: ../docker/ops-playground-image
-    image: apache/flink-ops-playground:2-FLINK-1.9-scala_2.11
+    image: apache/flink-ops-playground:3-FLINK-1.9-scala_2.11
     command: "flink run -d -p 2 /opt/ClickCountJob.jar --bootstrap.servers kafka:9092 --checkpointing --event-time"
     depends_on:
       - jobmanager
       - kafka
@@ -35,7 +35,7 @@ services:
     depends_on:
       - kafka
   jobmanager:
-    image: flink:1.9-scala_2.11
+    image: flink:1.9.2-scala_2.11
     command: "jobmanager.sh start-foreground"
     ports:
       - 8081:8081
@@ -46,7 +46,7 @@ services:
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager
  taskmanager:
-    image: flink:1.9-scala_2.11
+    image: flink:1.9.2-scala_2.11
    depends_on:
      - jobmanager
    command: "taskmanager.sh start-foreground"
[flink-playgrounds] branch release-1.10 updated: [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml
This is an automated email from the ASF dual-hosted git repository.

fhueske pushed a commit to branch release-1.10 in repository https://gitbox.apache.org/repos/asf/flink-playgrounds.git

The following commit(s) were added to refs/heads/release-1.10 by this push:
     new f3261ca  [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml
f3261ca is described below

commit f3261ca2bcfb69439050024cd94f2ceae488b0f1
Author: Fabian Hueske
AuthorDate: Wed Mar 11 11:26:17 2020 +0100

    [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml

    This closes #10.
---
 operations-playground/docker-compose.yaml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/operations-playground/docker-compose.yaml b/operations-playground/docker-compose.yaml
index 4b25f15..919f648 100644
--- a/operations-playground/docker-compose.yaml
+++ b/operations-playground/docker-compose.yaml
@@ -35,7 +35,7 @@ services:
     depends_on:
       - kafka
   jobmanager:
-    image: flink:1.10-scala_2.11
+    image: flink:1.10.0-scala_2.11
     command: "jobmanager.sh start-foreground"
     ports:
       - 8081:8081
@@ -46,7 +46,7 @@ services:
     environment:
       - JOB_MANAGER_RPC_ADDRESS=jobmanager
   taskmanager:
-    image: flink:1.10-scala_2.11
+    image: flink:1.10.0-scala_2.11
     depends_on:
       - jobmanager
     command: "taskmanager.sh start-foreground"
[flink-playgrounds] branch master updated: [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml
This is an automated email from the ASF dual-hosted git repository.

fhueske pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/flink-playgrounds.git

The following commit(s) were added to refs/heads/master by this push:
     new a27301e  [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml
a27301e is described below

commit a27301ecaace8bacefb2464ef0a788b81ba11827
Author: Fabian Hueske
AuthorDate: Wed Mar 11 11:26:17 2020 +0100

    [FLINK-16540] Fully specify bugfix version of Flink images in docker-compose.yaml

    This closes #10.
---
 operations-playground/docker-compose.yaml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/operations-playground/docker-compose.yaml b/operations-playground/docker-compose.yaml
index 4b25f15..919f648 100644
--- a/operations-playground/docker-compose.yaml
+++ b/operations-playground/docker-compose.yaml
@@ -35,7 +35,7 @@ services:
     depends_on:
       - kafka
   jobmanager:
-    image: flink:1.10-scala_2.11
+    image: flink:1.10.0-scala_2.11
     command: "jobmanager.sh start-foreground"
     ports:
       - 8081:8081
@@ -46,7 +46,7 @@ services:
     environment:
       - JOB_MANAGER_RPC_ADDRESS=jobmanager
   taskmanager:
-    image: flink:1.10-scala_2.11
+    image: flink:1.10.0-scala_2.11
     depends_on:
       - jobmanager
     command: "taskmanager.sh start-foreground"
[flink] branch release-1.10 updated: Revert "[FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster"
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch release-1.10 in repository https://gitbox.apache.org/repos/asf/flink.git

The following commit(s) were added to refs/heads/release-1.10 by this push:
     new 8abd110  Revert "[FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster"
8abd110 is described below

commit 8abd110a718942810bf59d76b53e0de8e8cf532b
Author: Dian Fu
AuthorDate: Fri Mar 20 22:59:48 2020 +0800

    Revert "[FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster"

    This reverts commit d173d337cda85084bb7890054bfe04cecf1f9694
---
 docs/dev/table/python/python_udfs.md    | 2 --
 docs/dev/table/python/python_udfs.zh.md | 2 --
 2 files changed, 4 deletions(-)

diff --git a/docs/dev/table/python/python_udfs.md b/docs/dev/table/python/python_udfs.md
index fd4a99d..44a30ff 100644
--- a/docs/dev/table/python/python_udfs.md
+++ b/docs/dev/table/python/python_udfs.md
@@ -24,8 +24,6 @@ under the License.
 
 User-defined functions are important features, because they significantly extend the expressiveness of Python Table API programs.
 
-**NOTE:** Python UDF execution requires Python3.5+ with PyFlink installed. It's required on both the client side and the cluster side.
-
 * This will be replaced by the TOC
 {:toc}

diff --git a/docs/dev/table/python/python_udfs.zh.md b/docs/dev/table/python/python_udfs.zh.md
index 11f896c..7acba1f 100644
--- a/docs/dev/table/python/python_udfs.zh.md
+++ b/docs/dev/table/python/python_udfs.zh.md
@@ -24,8 +24,6 @@ under the License.
 
 User-defined functions are important features, because they significantly extend the expressiveness of Python Table API programs.
 
-**NOTE:** Python UDF execution requires Python3.5+ with PyFlink installed. It's required on both the client side and the cluster side.
-
 * This will be replaced by the TOC
 {:toc}
[flink] branch 1.10 created (now 8a27bd9)
This is an automated email from the ASF dual-hosted git repository.

jincheng pushed a change to branch 1.10 in repository https://gitbox.apache.org/repos/asf/flink.git.

      at 8a27bd9  [FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster

No new revisions were added by this update.
[flink] 01/02: [FLINK-16538][python][docs] Restructure Python Table API documentation (#11375)
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch release-1.10 in repository https://gitbox.apache.org/repos/asf/flink.git

commit afe7c8e8346e7c8ecc844d8405619eed1ee588ad
Author: Dian Fu
AuthorDate: Fri Mar 13 14:32:57 2020 +0800

    [FLINK-16538][python][docs] Restructure Python Table API documentation (#11375)

    Adds an item "Python Table API" under "Table API & SQL" and move the documentation about Python Table API under it.
---
 docs/dev/table/config.md                          |   4 -
 docs/dev/table/functions/udfs.md                  | 151 ++
 docs/dev/table/functions/udfs.zh.md               | 151 ++
 docs/dev/table/python/dependency_management.md    |  94 ++
 docs/dev/table/python/dependency_management.zh.md |  94 ++
 docs/dev/table/python/index.md                    |  35 +
 docs/dev/table/python/index.zh.md                 |  35 +
 docs/dev/table/python/installation.md             |  44 +++
 docs/dev/table/python/installation.zh.md          |  44 +++
 docs/dev/table/python/python_config.md            |  32 +
 docs/dev/table/python/python_config.zh.md         |  32 +
 docs/dev/table/python/python_udfs.md              | 125 ++
 docs/dev/table/python/python_udfs.zh.md           | 125 ++
 docs/ops/python_shell.md                          |   7 +-
 docs/ops/python_shell.zh.md                       |   7 +-
 docs/tutorials/python_table_api.md                |  16 +--
 docs/tutorials/python_table_api.zh.md             |  12 +-
 17 files changed, 682 insertions(+), 326 deletions(-)

diff --git a/docs/dev/table/config.md b/docs/dev/table/config.md
index a9e9b71..dd312d6 100644
--- a/docs/dev/table/config.md
+++ b/docs/dev/table/config.md
@@ -104,7 +104,3 @@ The following options can be used to tune the performance of the query execution
 The following options can be used to adjust the behavior of the query optimizer to get a better execution plan.
 
 {% include generated/optimizer_config_configuration.html %}
-
-### Python Options
-
-{% include generated/python_configuration.html %}

diff --git a/docs/dev/table/functions/udfs.md b/docs/dev/table/functions/udfs.md
index 523cd61..37cfd04 100644
--- a/docs/dev/table/functions/udfs.md
+++ b/docs/dev/table/functions/udfs.md
@@ -134,37 +134,12 @@
 
-Note Python 3.5+ and apache-beam==2.15.0 are required to run the Python scalar function.
+In order to define a Python scalar function, one can extend the base class `ScalarFunction` in `pyflink.table.udf` and implement an evaluation method. The behavior of a Python scalar function is determined by the evaluation method which is named `eval`.
 
-Note By default PyFlink uses the command “python” to run the python udf workers. Before starting cluster, run the following command to confirm that it meets the requirements:
-
-{% highlight bash %}
-$ python --version
-# the version printed here must be 3.5+
-$ python -m pip install apache-beam==2.15.0
-{% endhighlight %}
-
-Note Currently, Python UDF is supported in Blink planner both under streaming and batch mode while is only supported under streaming mode in old planner.
-
-It supports to use both Java/Scala scalar functions and Python scalar functions in Python Table API and SQL. In order to define a Python scalar function, one can extend the base class `ScalarFunction` in `pyflink.table.udf` and implement an evaluation method. The behavior of a Python scalar function is determined by the evaluation method. An evaluation method must be named `eval`. Evaluation method can also support variable arguments, such as `eval(*args)`.
-
-The following example shows how to define your own Java and Python hash code functions, register them in the TableEnvironment, and call them in a query. Note that you can configure your scalar function via a constructor before it is registered:
+The following example shows how to define your own Python hash code function, register it in the TableEnvironment, and call it in a query. Note that you can configure your scalar function via a constructor before it is registered:
 
 {% highlight python %}
-'''
-Java code:
-
-// The Java class must have a public no-argument constructor and can be founded in current Java classloader.
-public class HashCode extends ScalarFunction {
-  private int factor = 12;
-
-  public int eval(String s) {
-      return s.hashCode() * factor;
-  }
-}
-'''
-
-class PyHashCode(ScalarFunction):
+class HashCode(ScalarFunction):
   def __init__(self):
     self.factor = 12
@@ -173,128 +148,18 @@ class PyHashCode(ScalarFunction):
 
 table_env = BatchTableEnvironment.create(env)
 
-# register the Java function
-table_env.register_java_function("hashCode", "my.java.function.HashCode")
-
 # register the Python function
-table_env.register_function("py_hash_code", udf(PyHash
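The `HashCode` example in the documentation diff is cut off above. Purely to illustrate the pattern, here is a self-contained sketch that runs without PyFlink: `ScalarFunction` below is a local stand-in for `pyflink.table.udf.ScalarFunction`, and the Java-style 32-bit string hash is an assumption chosen so the Python class mirrors the Java version's `s.hashCode() * factor`:

```python
# ScalarFunction here is a stand-in so the sketch runs without PyFlink;
# the real base class lives in pyflink.table.udf.
class ScalarFunction(object):
    """Local stand-in for pyflink.table.udf.ScalarFunction."""

class HashCode(ScalarFunction):
    def __init__(self):
        # configure the function via the constructor before registering it
        self.factor = 12

    def eval(self, s):
        # signed 32-bit hash equivalent to Java's String.hashCode()
        h = 0
        for ch in s:
            h = (31 * h + ord(ch)) & 0xFFFFFFFF
        if h >= 2 ** 31:
            h -= 2 ** 32
        return h * self.factor

print(HashCode().eval("abc"))  # 1156248  ("abc".hashCode() == 96354, times 12)
```

In real PyFlink code the instance would then be wrapped with `udf(...)` and registered on the `TableEnvironment`, as the (truncated) diff above shows.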
[flink] 02/02: [FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a commit to branch release-1.10 in repository https://gitbox.apache.org/repos/asf/flink.git

commit d173d337cda85084bb7890054bfe04cecf1f9694
Author: huangxingbo
AuthorDate: Fri Mar 20 17:41:38 2020 +0800

    [FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster

    This closes #11462.
---
 docs/dev/table/python/python_udfs.md    | 2 ++
 docs/dev/table/python/python_udfs.zh.md | 2 ++
 2 files changed, 4 insertions(+)

diff --git a/docs/dev/table/python/python_udfs.md b/docs/dev/table/python/python_udfs.md
index 44a30ff..fd4a99d 100644
--- a/docs/dev/table/python/python_udfs.md
+++ b/docs/dev/table/python/python_udfs.md
@@ -24,6 +24,8 @@ under the License.
 
 User-defined functions are important features, because they significantly extend the expressiveness of Python Table API programs.
 
+**NOTE:** Python UDF execution requires Python3.5+ with PyFlink installed. It's required on both the client side and the cluster side.
+
 * This will be replaced by the TOC
 {:toc}

diff --git a/docs/dev/table/python/python_udfs.zh.md b/docs/dev/table/python/python_udfs.zh.md
index 7acba1f..11f896c 100644
--- a/docs/dev/table/python/python_udfs.zh.md
+++ b/docs/dev/table/python/python_udfs.zh.md
@@ -24,6 +24,8 @@ under the License.
 
 User-defined functions are important features, because they significantly extend the expressiveness of Python Table API programs.
 
+**NOTE:** Python UDF execution requires Python3.5+ with PyFlink installed. It's required on both the client side and the cluster side.
+
 * This will be replaced by the TOC
 {:toc}
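The NOTE this commit adds ("Python 3.5+ with PyFlink installed, on both the client side and the cluster side") can be verified up front. A small preflight sketch using only the standard library (the function name and message strings are made up for illustration):

```python
# Preflight check for the requirement stated in the NOTE above:
# Python 3.5+ with PyFlink importable. Run on every machine that
# executes Python UDFs (client and cluster).
import importlib.util
import sys

def preflight():
    """Return a list of problems that would break Python UDF execution here."""
    problems = []
    if sys.version_info < (3, 5):
        problems.append("Python 3.5+ is required, found %d.%d"
                        % (sys.version_info[0], sys.version_info[1]))
    if importlib.util.find_spec("pyflink") is None:
        problems.append("PyFlink is not installed in this interpreter")
    return problems

print(preflight())  # [] when the environment is ready
```

Running this on the cluster nodes before submitting a job catches the "works locally, fails on the cluster" case the documentation note warns about.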
[flink] branch release-1.10 updated (3d46037 -> d173d33)
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a change to branch release-1.10 in repository https://gitbox.apache.org/repos/asf/flink.git.

    from 3d46037  [hotfix][FLINK-16220][json] Fix compile problem in JsonRowSerializationSchemaTest
     new afe7c8e  [FLINK-16538][python][docs] Restructure Python Table API documentation (#11375)
     new d173d33  [FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster

The 2 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.

Summary of changes:
 docs/dev/table/config.md                           |   4 -
 docs/dev/table/functions/udfs.md                   | 151 ++---
 docs/dev/table/functions/udfs.zh.md                | 151 ++---
 docs/dev/table/python/dependency_management.md     |  94 +
 docs/dev/table/python/dependency_management.zh.md  |  94 +
 docs/dev/table/python/index.md                     |  35 +
 docs/dev/table/python/index.zh.md                  |  35 +
 .../table/python/installation.md}                  |  27 +++-
 .../table/python/installation.zh.md}               |  27 +++-
 .../{tuning/index.md => python/python_config.md}   |  15 +-
 .../index.md => python/python_config.zh.md}        |  15 +-
 docs/dev/table/python/python_udfs.md               | 127 +
 docs/dev/table/python/python_udfs.zh.md            | 127 +
 docs/ops/python_shell.md                           |   7 +-
 docs/ops/python_shell.zh.md                        |   7 +-
 docs/tutorials/python_table_api.md                 |  16 +--
 docs/tutorials/python_table_api.zh.md              |  12 +-
 17 files changed, 602 insertions(+), 342 deletions(-)
 create mode 100644 docs/dev/table/python/dependency_management.md
 create mode 100644 docs/dev/table/python/dependency_management.zh.md
 create mode 100644 docs/dev/table/python/index.md
 create mode 100644 docs/dev/table/python/index.zh.md
 copy docs/{getting-started/tutorials/api_tutorials.zh.md => dev/table/python/installation.md} (52%)
 copy docs/{getting-started/tutorials/api_tutorials.zh.md => dev/table/python/installation.zh.md} (52%)
 copy docs/dev/table/{tuning/index.md => python/python_config.md} (55%)
 copy docs/dev/table/{tuning/index.md => python/python_config.zh.md} (55%)
 create mode 100644 docs/dev/table/python/python_udfs.md
 create mode 100644 docs/dev/table/python/python_udfs.zh.md
[flink] branch master updated (1be88b1 -> 8a27bd9)
This is an automated email from the ASF dual-hosted git repository.

dianfu pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git.

    from 1be88b1  [FLINK-16011] Distinguish between no time window and 0 length time window in NFACompiler
     add 8a27bd9  [FLINK-16691][python][docs] Improve Python UDF documentation to remind users to install PyFlink on the cluster

No new revisions were added by this update.

Summary of changes:
 docs/dev/table/python/python_udfs.md    | 2 ++
 docs/dev/table/python/python_udfs.zh.md | 2 ++
 2 files changed, 4 insertions(+)
[flink] branch master updated (b226cbd -> 1be88b1)
This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git.

    from b226cbd  [FLINK-16633][AZP] Fix builds without s3 credentials
     add 96dc51f  [FLINK-16011] fix the bug that `with` will not take effect if it is not at the end of the pattern
     add 1be88b1  [FLINK-16011] Distinguish between no time window and a 0-length time window in NFACompiler

No new revisions were added by this update.

Summary of changes:
 .../apache/flink/cep/nfa/compiler/NFACompiler.java | 12 +++-
 .../flink/cep/nfa/compiler/NFACompilerTest.java    | 21 +
 2 files changed, 28 insertions(+), 5 deletions(-)
[flink] branch master updated (0bc38ba -> b226cbd)
This is an automated email from the ASF dual-hosted git repository.

rmetzger pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git.

    from 0bc38ba  [FLINK-15962][network] Reduce the default chunk size to 4M in netty stack
     add b226cbd  [FLINK-16633][AZP] Fix builds without s3 credentials

No new revisions were added by this update.

Summary of changes:
 azure-pipelines.yml                                      |  8 +++-
 .../org/apache/flink/testutils/s3/S3TestCredentials.java | 11 +--
 tools/azure-pipelines/build-apache-repo.yml              |  3 +++
 tools/azure-pipelines/jobs-template.yml                  | 12 ++--
 4 files changed, 25 insertions(+), 9 deletions(-)
[flink] branch master updated (c60d11d -> 0bc38ba)
This is an automated email from the ASF dual-hosted git repository.

zhijiang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git.

    from c60d11d  [FLINK-16653][network][tests] Implement MockResultPartitionWriter base for simplifying tests
     add 0bc38ba  [FLINK-15962][network] Reduce the default chunk size to 4M in netty stack

No new revisions were added by this update.

Summary of changes:
 .../test-scripts/test_netty_shuffle_memory_control.sh   |  4 ++--
 .../flink/runtime/io/network/netty/NettyBufferPool.java | 14 --
 2 files changed, 10 insertions(+), 8 deletions(-)
[flink] branch release-1.10 updated: [hotfix][FLINK-16220][json] Fix compile problem in JsonRowSerializationSchemaTest
This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch release-1.10 in repository https://gitbox.apache.org/repos/asf/flink.git

The following commit(s) were added to refs/heads/release-1.10 by this push:
     new 3d46037  [hotfix][FLINK-16220][json] Fix compile problem in JsonRowSerializationSchemaTest
3d46037 is described below

commit 3d46037c102e32d26e1960c50d8146d4448e4e14
Author: Jark Wu
AuthorDate: Fri Mar 20 16:48:53 2020 +0800

    [hotfix][FLINK-16220][json] Fix compile problem in JsonRowSerializationSchemaTest
---
 .../org/apache/flink/formats/json/JsonRowSerializationSchemaTest.java | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowSerializationSchemaTest.java b/flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowSerializationSchemaTest.java
index 378f92b..cf459b5 100644
--- a/flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowSerializationSchemaTest.java
+++ b/flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowSerializationSchemaTest.java
@@ -114,8 +114,7 @@ public class JsonRowSerializationSchemaTest {
 			Types.PRIMITIVE_ARRAY(Types.INT));
 		JsonRowDeserializationSchema deserializationSchema = new JsonRowDeserializationSchema.Builder(schema)
 			.build();
-		JsonRowSerializationSchema serializationSchema = JsonRowSerializationSchema.builder()
-			.withTypeInfo(schema)
+		JsonRowSerializationSchema serializationSchema = new JsonRowSerializationSchema.Builder(schema)
 			.build();
 
 		for (int i = 0; i < jsons.length; i++) {
[flink] branch master updated (0619a5b -> c60d11d)
This is an automated email from the ASF dual-hosted git repository.

zhijiang pushed a change to branch master in repository https://gitbox.apache.org/repos/asf/flink.git.

    from 0619a5b  [FLINK-16220][json] Fix cast exception in JsonRowSerializationSchema when serializing null fields
     add c60d11d  [FLINK-16653][network][tests] Implement MockResultPartitionWriter base for simplifying tests

No new revisions were added by this update.

Summary of changes:
 .../AbstractCollectingResultPartitionWriter.java |  45 +
 .../AvailabilityTestResultPartitionWriter.java   |  57 +--
 .../io/network/api/writer/RecordWriterTest.java  | 105 +
 .../MockResultPartitionWriter.java}              |  42 +++--
 4 files changed, 21 insertions(+), 228 deletions(-)
 copy flink-runtime/src/test/java/org/apache/flink/runtime/io/network/{api/writer/AvailabilityTestResultPartitionWriter.java => partition/MockResultPartitionWriter.java} (64%)