Re: Where can I download the downloads/setup-pyflink-virtual-env.sh script

2021-11-18, Dian Fu
I just noticed that you are using YARN application mode. PyFlink supports YARN application mode only as of 1.14.0, which mainly added the command-line option
"-pyclientexec" and the configuration "python.client.executable":
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/dev/python/python_config/#python-client-executable

For your job, you need to use version 1.14.0 and also add the command-line option: -pyclientexec venv.zip/venv/bin/python
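
For illustration, here is the submission command quoted below adapted to 1.14.0 with -pyclientexec added (a sketch reusing the paths from the original command, not a verified invocation):

./flink-1.14.0/bin/flink \
  run-application -t yarn-application \
  -Dyarn.provided.lib.dirs="hdfs://nameservice1/user/flink/flinklib" \
  -Dyarn.application.queue=d \
  -p 1 \
  -pyarch /opt/venv.zip \
  -pyexec venv.zip/venv/bin/python \
  -pyclientexec venv.zip/venv/bin/python \
  -pyfs /opt/test.py \
  -c test.PyUDFTest \
  /opt/flink-python-test-1.0-SNAPSHOT.jar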


On Fri, Nov 19, 2021 at 10:48 AM Asahi Lee <978466...@qq.com.invalid> wrote:

> I tried both source my_env/bin/activate and setting the environment variable PYFLINK_CLIENT_EXECUTABLE; neither submission succeeded and the error is unchanged.
> In my case the jobmanager log shows No module named pyflink, and the jobmanager runs on the YARN cluster.
> What else do I still need to configure?
>
>
> > LogType:jobmanager.out
> > Log Upload Time: Thu Nov 18 20:48:45 +0800 2021
> > LogLength:37
> > Log Contents:
> > /bin/python: No module named pyflink
>
>
>
>
> -- Original Message --
> From:
>   "user-zh"
> <
> dian0511...@gmail.com>;
> Date: Friday, November 19, 2021, 9:38 AM
> To: "user-zh"
> Subject: Re: Where can I download the downloads/setup-pyflink-virtual-env.sh script
>
>
>
> -pyexec specifies the Python environment used on the cluster side; the client also needs a Python environment in order to compile the Flink job. See this document:
>
> https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/dev/python/dependency_management/#python-interpreter-of-client
>
> On Thu, Nov 18, 2021 at 9:00 PM Asahi Lee <978466...@qq.com.invalid>
> wrote:
>
> > Hi !
> >    I am using a Python UDF from the Java Table API. Submitting the
> > application with the command below fails with an error that the Python service
> > cannot be started. Is my submission correct? The jm log shows /bin/python: No module
> > named pyflink.
> >
> >
> > ./flink-1.13.2/bin/flink 
> > run-application -t yarn-application 
> >
> -Dyarn.provided.lib.dirs="hdfs://nameservice1/user/flink/flinklib" 
> > -Dyarn.application.queue=d
> > -p 1 
> > -pyarch /opt/venv.zip
> > -pyexec venv.zip/venv/bin/python 
> > -pyfs /opt/test.py 
> > -c test.PyUDFTest 
> > /opt/flink-python-test-1.0-SNAPSHOT.jar
> >
> >
> >
> > Error:
> > Caused by: java.lang.RuntimeException: Python callback server start
> failed!
> >         at
> >
> org.apache.flink.client.python.PythonFunctionFactory.createPythonFunctionFactory(PythonFunctionFactory.java:167)
> > ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at
> >
> org.apache.flink.client.python.PythonFunctionFactory$1.load(PythonFunctionFactory.java:88)
> > ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at
> >
> org.apache.flink.client.python.PythonFunctionFactory$1.load(PythonFunctionFactory.java:84)
> > ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824)
> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.client.python.PythonFunctionFactory.getPythonFunction(PythonFunctionFactory.java:129)
> > ~[flink-python_2.11-1.13.2.jar:1.13.2]
> >         at
> > sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[?:1.8.0_111]
> >         at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > ~[?:1.8.0_111]
> >         at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > ~[?:1.8.0_111]
> >         at
> > java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
> >         at
> >
> org.apache.flink.table.functions.python.utils.PythonFunctionUtils.getPythonFunction(PythonFunctionUtils.java:45)
> > ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.table.functions.UserDefinedFunctionHelper.instantiateFunction(UserDefinedFunctionHelper.java:206)
> > ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
> >         at
> >
> org.apache.flink.table.catal

Kafka producer error after a Flink job is cancelled

2021-11-18, yu...@kiscloud.net
Hi All:
   A job that had been running normally was cancelled, after which Flink printed the log below. How should I track down the root cause of this kind of exception?

TY-APP-DATA-REAL-COMPUTATION - [2021-11-18 22:10:32.465] - INFO
[RecordComputeOperator -> Sink: wi-data-sink (3/10)]
org.apache.flink.runtime.taskmanager.Task  - RecordComputeOperator ->
Sink: wi-data-sink (3/10) (b2166a7da829182804f4557a16eabc58) switched from
CANCELING to CANCELED.
TY-APP-DATA-REAL-COMPUTATION - [2021-11-18 22:10:32.466] - INFO
[RecordComputeOperator -> Sink: wi-data-sink (9/10)]
org.apache.kafka.clients.producer.KafkaProducer  - Closing the Kafka producer
with timeoutMillis = 9223372036854775807 ms.
TY-APP-DATA-REAL-COMPUTATION - [2021-11-18 22:10:32.467] - ERROR
[RecordComputeOperator -> Sink: wi-data-sink (9/10)]
org.apache.flink.streaming.runtime.tasks.StreamTask  - Error during disposal
of stream operator.
org.apache.flink.streaming.connectors.kafka.FlinkKafka011Exception: Failed to 
send data to Kafka: Failed to close kafka producer
at 
org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.checkErroneous(FlinkKafkaProducer011.java:1026)
at 
org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.close(FlinkKafkaProducer011.java:691)
at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
at 
org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:117)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:651)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.cleanUpInvoke(StreamTask.java:562)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:480)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:708)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:533)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.common.KafkaException: Failed to close kafka 
producer
at 
org.apache.kafka.clients.producer.KafkaProducer.close(KafkaProducer.java:1062)
at 
org.apache.kafka.clients.producer.KafkaProducer.close(KafkaProducer.java:1010)
at 
org.apache.kafka.clients.producer.KafkaProducer.close(KafkaProducer.java:989)
at 
org.apache.flink.streaming.connectors.kafka.internal.FlinkKafkaProducer.close(FlinkKafkaProducer.java:195)
at 
org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011.close(FlinkKafkaProducer011.java:674)
... 8 common frames omitted
Caused by: java.lang.InterruptedException: null
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1253)
at 
org.apache.kafka.clients.producer.KafkaProducer.close(KafkaProducer.java:1031)
... 12 common frames omitted



Re: Checkpoints keep failing after the Flink job runs for a while

2021-11-18, Caizhi Weng
Hi!

There are many possible reasons for a checkpoint timeout. The most common is that the node that timed out is too busy and blocks the checkpoint (insufficient compute
resources, data skew, etc.); you can judge this from the busy and backpressure information in the Flink web UI. Another common cause is overly frequent GC, which you can observe by printing a GC log via JVM parameters.
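
For the GC side specifically, a minimal sketch of enabling GC logs through flink-conf.yaml (env.java.opts is the standard Flink option; the JVM flags assume JDK 8, which the stack traces in this digest suggest, and the log path is hypothetical):

# flink-conf.yaml: pass GC-logging flags to the Flink JVMs
env.java.opts: "-XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/tmp/flink-gc.log"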

yu...@kiscloud.net wrote on Thu, Nov 18, 2021 at 2:54 PM:

> After the Flink job has been running for a while, checkpoints keep failing. The details are as follows:
> ID | Status | Acknowledged | Trigger Time | Latest Acknowledgement | End to End Duration | State Size | Buffered During Alignment
> 295 | FAILED | 30/50 | 11:55:38 | 11:55:39 | 1h 0m 0s | 205 KB | 0 B
>
> Checkpoint Detail:
> Path: - Discarded: - Failure Message: Checkpoint expired before completing.
>
> Operators:
> Name | Acknowledged | Latest Acknowledgment | End to End Duration | State Size | Buffered During Alignment
> Source: dw-member | 6/10 (60%) | 11:55:39 | 1s | 7.08 KB | 0 B
> Source: wi-order | 6/10 (60%) | 11:55:39 | 1s | 7.11 KB | 0 B
> Source: dw-pay | 6/10 (60%) | 11:55:39 | 1s | 7.11 KB | 0 B
> RecordTransformOperator | 6/10 (60%) | 11:55:39 | 1s | 98.8 KB | 0 B
> RecordComputeOperator -> Sink: dw-record-data-sink | 6/10 (60%) | 11:55:39 | 1s | 85.1 KB | 0 B
>
> SubTasks:
>  | End to End Duration | State Size | Checkpoint Duration (Sync) | Checkpoint Duration (Async) | Alignment Buffered | Alignment Duration
> Minimum | 1s | 14.2 KB | 7ms | 841ms | 0 B | 13ms
> Average | 1s | 14.2 KB | 94ms | 1s | 0 B | 13ms
> Maximum | 1s | 14.2 KB | 181ms | 1s | 0 B | 15ms
>
> ID | Acknowledgement Time | E2E Duration | State Size | Checkpoint Duration (Sync) | Checkpoint Duration (Async) | Align Buffered | Align Duration
> 1 | n/a
> 2 | 11:55:39 | 1s | 14.2 KB | 8ms | 1s | 0 B | 15ms
> 3 | n/a
> 4 | 11:55:39 | 1s | 14.2 KB | 181ms | 1s | 0 B | 13ms
> 5 | n/a
> 6 | 11:55:39 | 1s | 14.2 KB | 8ms | 1s | 0 B | 14ms
> 7 | 11:55:39 | 1s | 14.2 KB | 181ms | 961ms | 0 B | 13ms
> 8 | n/a
> 9 | 11:55:39 | 1s | 14.2 KB | 181ms | 841ms | 0 B | 13ms
> 10 | 11:55:39 | 1s | 14.2 KB | 7ms | 1s | 0 B | 14ms
>
>
> How should I troubleshoot this kind of problem? Are there any good suggestions or best practices? Thanks!
>


Re: Confusion about the different output printed in inStreamingMode and inBatchMode

2021-11-18, Caizhi Weng
Hi!

Data produced by a streaming job can have different change types, such as insert (+I), delete (-D), and update (-U, +U); see [1] for their exact meaning. The print
sink is mostly used for debugging, so users usually do not need to care much about these op values.

All data produced by a batch job are inserts, so no op column is needed.

[1]
https://nightlies.apache.org/flink/flink-docs-release-1.14/api/java/org/apache/flink/types/RowKind.html
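
As a minimal PyFlink sketch of the difference (a hypothetical example, assuming a local PyFlink installation):

from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming mode: print() emits a leading op column, e.g. +I for inserts.
t_env = TableEnvironment.create(
    EnvironmentSettings.new_instance().in_streaming_mode().build())
t_env.execute_sql("SELECT 1 AS a, 'hello' AS b").print()

# Batch mode: every row is an insert, so no op column is printed.
t_env = TableEnvironment.create(
    EnvironmentSettings.new_instance().in_batch_mode().build())
t_env.execute_sql("SELECT 1 AS a, 'hello' AS b").print()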

陈卓宇 <2572805...@qq.com.invalid> wrote on Thu, Nov 18, 2021 at 8:02 PM:

> In inStreamingMode the output has a leading op column and rows are marked +I.
> In inBatchMode it does not.
>
> What do the op column and +I mean, and why do they appear in inStreamingMode but not in inBatchMode?
>
> 陈
>
>
>  


Re: Where can I download the downloads/setup-pyflink-virtual-env.sh script

2021-11-18, Dian Fu
-pyexec specifies the Python environment used on the cluster side; the client also needs a Python environment in order to compile the Flink job. See this document:
https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/dev/python/dependency_management/#python-interpreter-of-client
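
As a minimal sketch of the client-side setup that document describes (my_env is the virtualenv mentioned in this thread; both options assume it contains a Python with PyFlink installed):

# Option 1: activate the virtualenv before running the flink CLI
source my_env/bin/activate
# Option 2: point the client at the interpreter explicitly
export PYFLINK_CLIENT_EXECUTABLE=my_env/bin/python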

On Thu, Nov 18, 2021 at 9:00 PM Asahi Lee <978466...@qq.com.invalid> wrote:

> Hi !
>    I am using a Python UDF from the Java Table API. Submitting the application
> with the command below fails with an error that the Python service cannot be started. Is my submission correct? The jm log shows /bin/python: No module named
> pyflink.
>
>
> ./flink-1.13.2/bin/flink 
> run-application -t yarn-application 
> -Dyarn.provided.lib.dirs="hdfs://nameservice1/user/flink/flinklib" 
> -Dyarn.application.queue=d
> -p 1 
> -pyarch /opt/venv.zip
> -pyexec venv.zip/venv/bin/python 
> -pyfs /opt/test.py 
> -c test.PyUDFTest 
> /opt/flink-python-test-1.0-SNAPSHOT.jar
>
>
>
> Error:
> Caused by: java.lang.RuntimeException: Python callback server start failed!
>         at
> org.apache.flink.client.python.PythonFunctionFactory.createPythonFunctionFactory(PythonFunctionFactory.java:167)
> ~[flink-python_2.11-1.13.2.jar:1.13.2]
>         at
> org.apache.flink.client.python.PythonFunctionFactory$1.load(PythonFunctionFactory.java:88)
> ~[flink-python_2.11-1.13.2.jar:1.13.2]
>         at
> org.apache.flink.client.python.PythonFunctionFactory$1.load(PythonFunctionFactory.java:84)
> ~[flink-python_2.11-1.13.2.jar:1.13.2]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.get(LocalCache.java:3937)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.shaded.guava18.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824)
> ~[flink-dist_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.client.python.PythonFunctionFactory.getPythonFunction(PythonFunctionFactory.java:129)
> ~[flink-python_2.11-1.13.2.jar:1.13.2]
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_111]
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> ~[?:1.8.0_111]
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> ~[?:1.8.0_111]
>         at
> java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_111]
>         at
> org.apache.flink.table.functions.python.utils.PythonFunctionUtils.getPythonFunction(PythonFunctionUtils.java:45)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.functions.UserDefinedFunctionHelper.instantiateFunction(UserDefinedFunctionHelper.java:206)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.catalog.FunctionCatalog.getFunctionDefinition(FunctionCatalog.java:659)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.catalog.FunctionCatalog.resolveAmbiguousFunctionReference(FunctionCatalog.java:606)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.catalog.FunctionCatalog.lookupFunction(FunctionCatalog.java:362)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.flink.table.planner.catalog.FunctionCatalogOperatorTable.lookupOperatorOverloads(FunctionCatalogOperatorTable.java:97)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.calcite.sql.util.ChainedSqlOperatorTable.lookupOperatorOverloads(ChainedSqlOperatorTable.java:67)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1183)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1169)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1200)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.calcite.sql.validate.SqlValidatorImpl.performUnconditionalRewrites(SqlValidatorImpl.java:1169)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:945)
> ~[flink-table-blink_2.11-1.13.1.jar:1.13.1]
>         at
> org.apache.ca

Unsubscribe

2021-11-18, drewfranklin


Unsubscribe

drewfranklin
drewfrank...@163.com



Re: flink1.13.1 sql client connect hivecatalog error

2021-11-18, drewfranklin
The image is below. I also tried registering the catalog directly with DDL and got the same error; it looks like jars are missing, but I don't know which jars still need to be added:

On Nov 18, 2021 at 19:23, Shengkai Fang wrote:
Hi, I can't see the image; please use an image host or paste the code instead.

I see a yaml file in your code; actually, creating the catalog with DDL is recommended.

best,
Shengkai

drewfranklin wrote on Thu, Nov 18, 2021 at 6:01 PM:

Hello, friends !
I ran into an error when following the official docs to connect the sql client to a hive catalog.
My hive version is 2.3.6
Flink version is 1.13.1

Following the bundled approach from the official docs, I added the jars shown in the screenshot below under flink/lib, then restarted the cluster and started sql-client.
Connecting fails with the error below; it looks like jars are missing, but I don't know which ones. The second approach I tried gives the same error. It seems the catalog connection cannot be created.
Yaml file:


Reading session environment from:
file:/Users/feng/flink-1.13.1/bin/../catlog_yaml/hiveCatalog.yaml


Exception in thread "main"
org.apache.flink.table.client.SqlClientException: Unexpected exception.
This is a bug. Please consider filing an issue.
at
org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
Caused by: org.apache.flink.table.api.ValidationException: Unable to
create catalog 'myhive'.

Catalog options are:
'hive-conf-dir'='/Users/feng/hive-2.3.6/conf'
'type'='hive'
at
org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:270)
at
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.createCatalog(LegacyTableEnvironmentInitializer.java:217)
at
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.lambda$initializeCatalogs$1(LegacyTableEnvironmentInitializer.java:120)
at java.base/java.util.HashMap.forEach(HashMap.java:1336)
at
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeCatalogs(LegacyTableEnvironmentInitializer.java:117)
at
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeSessionState(LegacyTableEnvironmentInitializer.java:105)
at
org.apache.flink.table.client.gateway.context.SessionContext.create(SessionContext.java:233)
at
org.apache.flink.table.client.gateway.local.LocalContextUtils.buildSessionContext(LocalContextUtils.java:100)
at
org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:91)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:88)
at
org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
... 1 more
Caused by: org.apache.flink.table.api.TableException: Could not load
service provider for factories.
at
org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:507)
at
org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:298)
at
org.apache.flink.table.factories.FactoryUtil.getCatalogFactory(FactoryUtil.java:455)
at
org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:251)
... 11 more
Caused by: java.util.ServiceConfigurationError:
org.apache.flink.table.factories.Factory:
org.apache.flink.table.module.hive.HiveModuleFactory not a subtype
at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:589)
at
java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1237)
at
java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1265)
at
java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1300)
at
java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1385)
at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
at
org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:503)
... 14 more



Re: flink1.13.1 sql client connect hivecatalog error

2021-11-18, Shengkai Fang
Hi, I can't see the image; please use an image host or paste the code instead.

I see a yaml file in your code; actually, creating the catalog with DDL is recommended (see the sketch below).

best,
Shengkai
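
For reference, a sketch of that DDL, assembled from the catalog name and options shown in the error message quoted below (the hive-conf-dir path is the reporter's local one):

CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/Users/feng/hive-2.3.6/conf'
);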

drewfranklin wrote on Thu, Nov 18, 2021 at 6:01 PM:

> Hello, friends !
> I ran into an error when following the official docs to connect the sql client to a hive catalog.
> My hive version is 2.3.6
> Flink version is 1.13.1
>
> Following the bundled approach from the official docs, I added the jars shown in the screenshot below under flink/lib, then restarted the cluster and started
> sql-client. Connecting fails with the error below; it looks like jars are missing, but I don't know which ones. The second approach gives the same error. It seems the catalog connection cannot be created.
> Yaml file:
>
>
> Reading session environment from:
> file:/Users/feng/flink-1.13.1/bin/../catlog_yaml/hiveCatalog.yaml
>
>
> Exception in thread "main"
> org.apache.flink.table.client.SqlClientException: Unexpected exception.
> This is a bug. Please consider filing an issue.
>at
> org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
>at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
> Caused by: org.apache.flink.table.api.ValidationException: Unable to
> create catalog 'myhive'.
>
> Catalog options are:
> 'hive-conf-dir'='/Users/feng/hive-2.3.6/conf'
> 'type'='hive'
>at
> org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:270)
>at
> org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.createCatalog(LegacyTableEnvironmentInitializer.java:217)
>at
> org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.lambda$initializeCatalogs$1(LegacyTableEnvironmentInitializer.java:120)
>at java.base/java.util.HashMap.forEach(HashMap.java:1336)
>at
> org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeCatalogs(LegacyTableEnvironmentInitializer.java:117)
>at
> org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeSessionState(LegacyTableEnvironmentInitializer.java:105)
>at
> org.apache.flink.table.client.gateway.context.SessionContext.create(SessionContext.java:233)
>at
> org.apache.flink.table.client.gateway.local.LocalContextUtils.buildSessionContext(LocalContextUtils.java:100)
>at
> org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:91)
>at org.apache.flink.table.client.SqlClient.start(SqlClient.java:88)
>at
> org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
>... 1 more
> Caused by: org.apache.flink.table.api.TableException: Could not load
> service provider for factories.
>at
> org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:507)
>at
> org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:298)
>at
> org.apache.flink.table.factories.FactoryUtil.getCatalogFactory(FactoryUtil.java:455)
>at
> org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:251)
>... 11 more
> Caused by: java.util.ServiceConfigurationError:
> org.apache.flink.table.factories.Factory:
> org.apache.flink.table.module.hive.HiveModuleFactory not a subtype
>at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:589)
>at
> java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1237)
>at
> java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1265)
>at
> java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1300)
>at
> java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1385)
>at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
>at
> org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:503)
>... 14 more
>


flink1.13.1 sql client connect hivecatalog error

2021-11-18, drewfranklin
Hello, friends !
   I ran into an error when following the official docs to connect the sql client to a hive catalog.
My hive version is 2.3.6
  Flink version is 1.13.1


Following the bundled approach from the official docs, I added the jars shown in the screenshot below under flink/lib, then restarted the cluster and started sql-client.
Connecting fails with the error below; it looks like jars are missing, but I don't know which ones. The second approach I tried gives the same error. It seems the catalog connection cannot be created.
Yaml file:




Reading session environment from: 
file:/Users/feng/flink-1.13.1/bin/../catlog_yaml/hiveCatalog.yaml


Exception in thread "main" org.apache.flink.table.client.SqlClientException: 
Unexpected exception. This is a bug. Please consider filing an issue.
   at 
org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
   at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
Caused by: org.apache.flink.table.api.ValidationException: Unable to create 
catalog 'myhive'.

Catalog options are:
'hive-conf-dir'='/Users/feng/hive-2.3.6/conf'
'type'='hive'
   at 
org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:270)
   at 
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.createCatalog(LegacyTableEnvironmentInitializer.java:217)
   at 
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.lambda$initializeCatalogs$1(LegacyTableEnvironmentInitializer.java:120)
   at java.base/java.util.HashMap.forEach(HashMap.java:1336)
   at 
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeCatalogs(LegacyTableEnvironmentInitializer.java:117)
   at 
org.apache.flink.table.client.gateway.context.LegacyTableEnvironmentInitializer.initializeSessionState(LegacyTableEnvironmentInitializer.java:105)
   at 
org.apache.flink.table.client.gateway.context.SessionContext.create(SessionContext.java:233)
   at 
org.apache.flink.table.client.gateway.local.LocalContextUtils.buildSessionContext(LocalContextUtils.java:100)
   at 
org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:91)
   at org.apache.flink.table.client.SqlClient.start(SqlClient.java:88)
   at 
org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
   ... 1 more
Caused by: org.apache.flink.table.api.TableException: Could not load service 
provider for factories.
   at 
org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:507)
   at 
org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:298)
   at 
org.apache.flink.table.factories.FactoryUtil.getCatalogFactory(FactoryUtil.java:455)
   at 
org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:251)
   ... 11 more
Caused by: java.util.ServiceConfigurationError: 
org.apache.flink.table.factories.Factory: 
org.apache.flink.table.module.hive.HiveModuleFactory not a subtype
   at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:589)
   at 
java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1237)
   at 
java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1265)
   at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1300)
   at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1385)
   at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
   at 
org.apache.flink.table.factories.FactoryUtil.discoverFactories(FactoryUtil.java:503)
   ... 14 more


Re: FlinkSQL ES7 connector not working

2021-11-18, mispower
Hi, how was this problem ultimately resolved? I ran into the same issue.


At 2021-06-10 10:15:58, "mokaful" <649713...@qq.com> wrote:
>org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot
>instantiate user function.
>   at
>org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperatorFactory(StreamConfig.java:338)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOperator(OperatorChain.java:653)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOperatorChain(OperatorChain.java:626)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:566)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOperatorChain(OperatorChain.java:616)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:566)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOperatorChain(OperatorChain.java:616)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:566)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOperatorChain(OperatorChain.java:616)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:566)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:181)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.StreamTask.executeRestore(StreamTask.java:548)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.StreamTask.runWithCleanUpOnFail(StreamTask.java:647)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at
>org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:537)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:759)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
>~[flink-dist_2.11-1.13.1.jar:1.13.1]
>   at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_181]
>Caused by: java.io.InvalidClassException:
>org.apache.flink.streaming.connectors.elasticsearch.table.Elasticsearch7DynamicSink$AuthRestClientFactory;
>local class incompatible: stream classdesc serialVersionUID =
>-2564582543942331131, local class serialVersionUID = -2353232579685349916
>   at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:699)
>~[?:1.8.0_181]
>   at 
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
>~[?:1.8.0_181]
>   at
>java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>~[?:1.8.0_181]
>   at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
>~[?:1.8.0_181]
>   at
>java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>~[?:1.8.0_181]
>   at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
>~[?:1.8.0_181]
>   at
>java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>~[?:1.8.0_181]
>   at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
>~[?:1.8.0_181]
>   at
>java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>~[?:1.8.0_181]
>   at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
>~[?:1.8.0_181]
>   at
>java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
>~[?:1.8.0_181]
>   at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
>~[?:1.8.0_181]
>   at
>org.apache.f