Re: Providing external files to flink classpath

2019-06-28 Thread Yun Tang
Hi Vishwas, 1. You could use '-yt' to ship specified files to the classpath; please refer to [1] for more details. 2. If the properties are only loaded on the client side before executing the application, you could let your application just read from the local property data. Flink supports …
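The two options above can be combined: ship the files alongside the job with '-yt' and have the application read them at runtime. A minimal sketch of the reading side, assuming a plain java.util.Properties file — the class name and path here are illustrative, not Flink API:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ExternalConfigLoader {

    // Load properties from a path on the local filesystem, e.g. a file
    // shipped to the container via 'flink run -yt <dir>'.
    public static Properties load(String path) throws IOException {
        Properties props = new Properties();
        try (InputStream in = new FileInputStream(path)) {
            props.load(in);
        }
        return props;
    }
}
```

Because the file is read from the local filesystem rather than the fat jar, it can be swapped per environment without rebuilding the application.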

Re: LookupableTableSource question

2019-06-28 Thread JingsongLee
The keys mean a composite primary key; it is not a list of independent keys. In your case, maybe there is a single key? Best, Jingsong Lee Sent from Alibaba Mail for iPhone --Original Mail -- From:Flavio Pompermaier Date:2019-06-28 22:53:31 Recipient:JingsongLee CC:user Subject:Re: …

Providing external files to flink classpath

2019-06-28 Thread Vishwas Siravara
Hi, I am trying to add external property files to the Flink classpath for my application. These files are not part of the fat jar. I put them under the lib folder, but Flink can't find them. How can I manage external property files that need to be read by Flink? Thanks, Vishwas

Re: LookupableTableSource question

2019-06-28 Thread Flavio Pompermaier
Sorry, I copied and pasted the current eval method twice... I'd do this: public void eval(Object... keys) { for (Object key : keys) { Row keyRow = Row.of(key); if (cache != null) { List<Row> cachedRows = cache.getIfPresent(keyRow); if (cachedRows != null) { …
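The per-key pattern Flavio proposes can be sketched as follows: each key gets its own cache entry, so a miss on one key triggers a single backend lookup. A plain HashMap stands in for the Guava cache used in the real JDBCLookupFunction, and fetchRows() is a hypothetical placeholder for the JDBC/REST lookup:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class PerKeyLookup {
    // HashMap stand-in for the Guava Cache in JDBCLookupFunction.
    private final Map<Object, List<String>> cache = new HashMap<>();
    private final Function<Object, List<String>> fetchRows; // backing lookup (hypothetical)

    public PerKeyLookup(Function<Object, List<String>> fetchRows) {
        this.fetchRows = fetchRows;
    }

    // Cycle over the passed keys; each key is cached individually.
    public List<String> eval(Object... keys) {
        List<String> collected = new ArrayList<>();
        for (Object key : keys) {
            List<String> cachedRows = cache.get(key);
            if (cachedRows == null) {          // cache miss: hit the backend once
                cachedRows = fetchRows.apply(key);
                cache.put(key, cachedRows);
            }
            collected.addAll(cachedRows);
        }
        return collected;
    }
}
```

This contrasts with caching a whole block of keys under one entry, where any change in the key set forces a full re-fetch.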

Re: LookupableTableSource question

2019-06-28 Thread Flavio Pompermaier
This could be a good fit; I'll try to dig into it and see if it can be adapted to a REST service. The only strange thing I see is that the key of the local cache is per block of keys... am I wrong? Shouldn't it cycle over the list of passed keys? Right now it's the following: Cache<Row, List<Row>> cache; public …

Re: Apache Flink - Running application with different flink configurations (flink-conf.yml) on EMR

2019-06-28 Thread Jeff Zhang
This is because Flink doesn't unify execution across different environments. The community has discussed before how to enhance the Flink client API. The initial proposal is to introduce a FlinkConf which contains all the configuration, so that we can unify the executions in all environments …

Re: Apache Flink - Running application with different flink configurations (flink-conf.yml) on EMR

2019-06-28 Thread Xintong Song
Hi Singh, I don't think that should work. The -D or -yD parameters need to be passed to the Flink start-up scripts or the "flink run" command. I don't think the IntelliJ VM arguments are equivalent to that. In fact, I'm not aware of any method to set "-D" parameters when running jobs in the IDE.
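To illustrate why the placement matters: dynamic properties are parsed by the CLI front end itself and merged into the cluster configuration, so JVM arguments set in the IDE never reach that code path. A minimal sketch of how "-Dkey=value" arguments are turned into configuration entries (illustrative only, not Flink's actual parser):

```java
import java.util.HashMap;
import java.util.Map;

public class DynamicProps {

    // Collect "-Dkey=value" pairs from a command line into a config map,
    // the way a CLI front end would before handing them to the cluster.
    public static Map<String, String> parse(String[] args) {
        Map<String, String> conf = new HashMap<>();
        for (String arg : args) {
            if (arg.startsWith("-D") && arg.contains("=")) {
                String kv = arg.substring(2);
                int eq = kv.indexOf('=');
                conf.put(kv.substring(0, eq), kv.substring(eq + 1));
            }
        }
        return conf;
    }
}
```

In the IDE the entry point is the user's main() method, not this kind of front end, which is why such flags silently have no effect there.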

Re: LookupableTableSource question

2019-06-28 Thread JingsongLee
Hi Flavio: I just implemented a JDBCLookupFunction[1]. You can use it as a table function[2], or use the Blink temporal table join[3] (needs Blink planner support). I added a Google Guava cache in JDBCLookupFunction with a configurable cacheMaxSize (to avoid OOM) and cacheExpireMs (for the freshness of …

LookupableTableSource question

2019-06-28 Thread Flavio Pompermaier
Hi to all, I have a use case where I'd like to enrich a stream using a rarely updated lookup table. Basically, I'd like to be able to set a refresh policy that is triggered either when a key was not found (a new key has probably been added in the meantime) or when a configurable refresh period has …
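The refresh policy described above can be sketched as a cached lookup that reloads an entry either when the key is missing or when the configured refresh period has elapsed. The loader and the clock are injected so the policy is testable; all names here are illustrative, not Flink API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;
import java.util.function.LongSupplier;

public class RefreshingLookup<K, V> {

    private static final class Entry<V> {
        final V value;
        final long loadedAt;
        Entry(V value, long loadedAt) { this.value = value; this.loadedAt = loadedAt; }
    }

    private final Map<K, Entry<V>> cache = new HashMap<>();
    private final Function<K, V> loader;   // backing lookup-table fetch
    private final LongSupplier clock;      // millis; injectable for tests
    private final long refreshPeriodMs;

    public RefreshingLookup(Function<K, V> loader, LongSupplier clock, long refreshPeriodMs) {
        this.loader = loader;
        this.clock = clock;
        this.refreshPeriodMs = refreshPeriodMs;
    }

    public V get(K key) {
        long now = clock.getAsLong();
        Entry<V> e = cache.get(key);
        // Reload when the key was never seen, or the entry is older than the period.
        if (e == null || now - e.loadedAt >= refreshPeriodMs) {
            e = new Entry<>(loader.apply(key), now);
            cache.put(key, e);
        }
        return e.value;
    }
}
```

A miss on a brand-new key immediately hits the backing table (covering the "new key added in the meantime" case), while existing keys are served from cache until the refresh period expires.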

Re: Re: Re: Flink 1.8 + Hadoop 3.1.2 build issue

2019-06-28 Thread USERNAME
Hello, many thanks for your reply, Mr. Tang — the problem is solved. Thanks! On 2019-06-28 15:59:48, "Yun Tang" wrote: >Hi >Since Flink 1.8, Flink's default build options no longer include the Hadoop dependencies; see [1] for more information. The official download page [2] also notes that, starting from Flink 1.8, the shaded-hadoop jars need to be downloaded separately and placed under the lib directory. > >If you need the shaded-hadoop jars, you can find them under the built flink-shaded-hadoop sub-project directory. > >[1]

Re: Debug Kryo.Serialization Exception

2019-06-28 Thread Flavio Pompermaier
Indeed, looking at StreamElementSerializer, the duplicate() method could be bugged: @Override public StreamElementSerializer duplicate() { TypeSerializer copy = typeSerializer.duplicate(); return (copy == typeSerializer) ? this : new StreamElementSerializer(copy); } Is it safe to …
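The suspicion above is about the duplicate() contract: returning 'this' is only safe when the wrapper itself holds no mutable state, because a stateless inner serializer (one whose duplicate() returns itself) then causes the same wrapper instance to be shared across threads. A toy sketch of the pattern — these are illustrative classes, not Flink's TypeSerializer:

```java
public class DuplicateContract {

    public interface Ser { Ser duplicate(); }

    // Stateless inner serializer: duplicate() may return the same instance.
    public static final class StatelessSer implements Ser {
        public Ser duplicate() { return this; }
    }

    // Wrapper mirroring StreamElementSerializer.duplicate(): returns 'this'
    // when the inner copy is the same object, otherwise wraps the fresh copy.
    public static final class WrapperSer implements Ser {
        public final Ser inner;
        public WrapperSer(Ser inner) { this.inner = inner; }
        public Ser duplicate() {
            Ser copy = inner.duplicate();
            return (copy == inner) ? this : new WrapperSer(copy);
        }
    }
}
```

So with a stateless inner serializer every thread ends up with the identical wrapper object; that is only correct if the wrapper itself is thread-safe, which is exactly the question raised here.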

Re: Re: Flink 1.8 + Hadoop 3.1.2 build issue

2019-06-28 Thread Yun Tang
Hi, Since Flink 1.8, Flink's default build options no longer include the Hadoop dependencies; see [1] for more information. The official download page [2] also notes that, starting from Flink 1.8, the shaded-hadoop jars need to be downloaded separately and placed under the lib directory. If you need the shaded-hadoop jars, you can find them under the built flink-shaded-hadoop sub-project directory. [1] https://issues.apache.org/jira/browse/FLINK-11266 [2]

Re: How to run Graph algorithms

2019-06-28 Thread Robert Metzger
Hey, this page explains how to run a Flink job: https://ci.apache.org/projects/flink/flink-docs-master/getting-started/tutorials/local_setup.html On Sat, May 25, 2019 at 1:28 PM RAMALINGESWARA RAO THOTTEMPUDI < tr...@iitkgp.ac.in> wrote: > > Respected All, > I am a new learner of Apache Flink. I

Re: Debug Kryo.Serialization Exception

2019-06-28 Thread Flavio Pompermaier
Hi Fabian, we had similar errors with Flink 1.3 [1][2], and the error was caused by the fact that a serializer was sharing the same object with multiple threads. The error was not deterministic and changed from time to time, so maybe it could be something similar (IMHO). [1]

Re: Re:Flink batch job memory/disk leak when invoking set method on a static Configuration object.

2019-06-28 Thread Vadim Vararu
Hi, I've run it on a standalone Flink cluster. No Yarn involved. From: Haibo Sun Sent: Friday, June 28, 2019 6:13 AM To: Vadim Vararu Cc: user@flink.apache.org Subject: Re:Flink batch job memory/disk leak when invoking set method on a static Configuration

Flink 1.8 + Hadoop 3.1.2 build issue

2019-06-28 Thread USERNAME
1. Software versions: Flink 1.8, Hadoop 3.1.2, Apache Maven 3.0.5. 2. Steps: >git clone -b release-1.8.0 https://github.com/apache/flink >cd flink >mvn clean install -DskipTests -Dhadoop.version=3.1.2 3. Problem: after a successful build, the flink/build-target/lib directory contains only three files (screenshot omitted), while a normal Flink 1.7.2 build result looks different (screenshot omitted). Has anyone run into this problem?

Re: Re: checkpoint state size question

2019-06-28 Thread CHENJIE
Hello, if a session window needs to stay open for a long time and the data volume is large, such a window causes the checkpoint state size to become very large. Is there a mechanism to expire and discard state older than a certain time? On 2019-06-26 16:23:13, "Yun Tang" wrote: >Hi > >This question is a bit broad, since you did not describe the period over which the checkpoint state size keeps growing. The checkpoint state size can grow for several reasons: > > 1. The upstream data volume has increased. > 2.

Re: Debug Kryo.Serialization Exception

2019-06-28 Thread Fabian Wollert
Additionally, we keep getting these as well, all the time: com.esotericsoftware.kryo.KryoException: java.lang.ArrayIndexOutOfBoundsException Serialization trace: _children (com.fasterxml.jackson.databind.node.ObjectNode) at