Hi,
In short, [1] determines whether the job will trigger checkpoints, and [2]
determines which operators take action when checkpoints are triggered.
If you use ExampleCountSource, flink-streaming-java should be a dependency in
pom.xml, and classes such as ListState, ListStateDescriptor,
Hi, the class org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint should be in
the flink-yarn module, which gets packaged into flink-dist as a dependency when
the lib package is built. Why did you add both flink-dist_2.11-1.10.1.jar and
flink-yarn_2.11-1.11.1.jar at the same time? Wouldn't they conflict?
Smile
On 2021-02-23 19:27:43, "凌战" wrote:
>The jars added above were not shown, so let me add them here: currently, apart from the user jar, the dependency jars added are
Check what the HADOOP_CLASSPATH environment variable contains, and whether it
matches the output of running "hadoop classpath".
According to [1], starting from Flink 1.11, the HDFS-related jars are no longer
bundled into the Flink distribution by default; instead, users need to specify
the path to the HDFS-related jars at execution time. The statement
export HADOOP_CLASSPATH=`hadoop classpath` actually runs the hadoop classpath
command and assigns its output to the HADOOP_CLASSPATH environment variable.
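As a quick illustration of the command substitution involved, here is a minimal sketch. `echo` stands in for `hadoop classpath`, and the classpath string is made up for the example, not a real Hadoop layout:

```shell
# A function mocking `hadoop classpath`; the paths below are hypothetical.
mock_hadoop_classpath() {
  echo "/opt/hadoop/etc/hadoop:/opt/hadoop/share/hadoop/common/*"
}

# Backticks (or $(...)) run the command and capture its stdout,
# which is then assigned to the environment variable.
export HADOOP_CLASSPATH=$(mock_hadoop_classpath)
echo "$HADOOP_CLASSPATH"
```

With the real command, replacing the mock with `hadoop classpath` exports whatever classpath the local Hadoop installation reports.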
Also, to isolate variables, could you try submitting the simplest possible job
and see what happens?
Judging from the error log above, what you submitted should be a
How about using a left join or a full join? That way, records that fail to
match will also be emitted when the interval ends, with the fields from the
other side filled with null. Currently, Interval Join in the DataStream API
does not support outer joins, but the Table API/SQL does; see [1].

[1] https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/tableApi.html#joins
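For illustration, an interval outer join in Flink SQL might look like the sketch below. The table and column names (Orders, Shipments, etc.) and the four-hour interval are made up for the example:

```sql
-- Hypothetical tables: emit every Orders row even when no Shipment
-- matches within the interval; the Shipment columns become NULL.
SELECT o.id, o.order_time, s.ship_time
FROM Orders o
LEFT JOIN Shipments s
  ON o.id = s.order_id
  AND s.ship_time BETWEEN o.order_time
                      AND o.order_time + INTERVAL '4' HOUR
```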
On 2021-02-08 19:05:56, "lxk7...@163.com" wrote:
>
2021 at 8:25 AM Smile@LETTers wrote:
Hi Matthias,
Sorry for my misleading wording. I meant kafka-schema-serializer rather than
kafka-avro-serializer.
io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe is in
kafka-schema-serializer and kafka-schema-serializer should be a dependency of
kafka-avro-
ncy you're talking about is already part of
flink-end-to-end-tests/flink-end-to-end-tests-common-kafka/pom.xml. It also has
the correct scope set both in master and release-1.12.
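For reference, a test-scoped dependency declaration in a Maven pom.xml looks like the sketch below. The version property name is a placeholder, not necessarily the property used in the actual Flink pom:

```xml
<!-- Sketch: a dependency restricted to the test scope.
     ${confluent.version} is a hypothetical property name. -->
<dependency>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-serializer</artifactId>
  <version>${confluent.version}</version>
  <scope>test</scope>
</dependency>
```

With `<scope>test</scope>`, the artifact is on the classpath for compiling and running tests, but is not a transitive dependency of the module itself.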
Best,
Matthias
On Fri, Jan 22, 2021 at 10:04 AM Smile@LETTers wrote:
Yes, I've tried from both the root directory a
iling when you build Flink from the root
directory (not calling maven from within a maven module?)
On Tue, Jan 19, 2021 at 11:19 AM Smile@LETTers wrote:
Hi,
I got an error when trying to compile & package Flink (version 1.12 & the
current master).
It can be reproduced by running 'mvn clean test' under
flink-end-to-end-tests/flink-end-to-end-tests-common-kafka.
It seems that a necessary dependency for the test scope was missing and some
classes cannot be