Hi:
I am currently using the Flink 1.1.4 release in standalone cluster mode. I want
to set up Kerberos authentication between the Flink CLI and the JobManager, but
flink-conf.yaml contains no Flink cluster security configuration options.
Does Kerberos authentication work in the Flink 1.1.4 release?
Thanks in advance.
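For reference, first-class Kerberos settings only appeared in flink-conf.yaml with Flink 1.2; a sketch of those later keys (the keytab path, principal, and context names below are placeholders):

```yaml
# Available from Flink 1.2 onward, not in 1.1.4 (placeholder values):
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/flink.keytab
security.kerberos.login.principal: flink-user@EXAMPLE.COM
# Which login contexts should use the credentials (e.g. Kafka, ZooKeeper):
security.kerberos.login.contexts: Client,KafkaClient
```

In 1.1.x, Kerberos support was limited to what Hadoop's security machinery provided (e.g. a `kinit` ticket cache for secure HDFS/YARN), without CLI-to-JobManager authentication.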
Hi,
I have a Flink job for which I can trigger a savepoint with no problem.
However, if I cancel the job and then try to run it from the savepoint, I get the
following exception. Any ideas how I can debug or fix it? I am using the exact
same jar, so I did not modify the program in any manner. Using
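For context, a sketch of the CLI sequence described above, as it works in Flink 1.1 (the job ID, savepoint path, and jar name are placeholders):

```shell
# Trigger a savepoint for a running job (job ID is a placeholder).
bin/flink savepoint 5e20cb6b0f357591171dfcca2eea09de

# Cancel the job, then resume from the savepoint path printed above
# (savepoint path and jar name are placeholders).
bin/flink cancel 5e20cb6b0f357591171dfcca2eea09de
bin/flink run -s /tmp/savepoints/savepoint-abc123 my-job.jar
```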
Hi!
It is probably some inconsistent configuration in the IDE.
It often helps to do "Maven->Reimport" or use "restart and clear caches".
On Tue, Jan 3, 2017 at 9:48 AM, Stephan Epping
wrote:
> Hey,
>
>
> thanks for the reply. I didn’t change the scala version, as it worked before.
> I just c
Hi,
Your questions seem too abstract and theoretical. The answer is: it
depends on several factors: skewness in the data, data volume, reliability
requirements, the "fatness" of the servers, whether one performs look-ups in
other data sources, etc.
The papers you mentioned mean the following: under concr
Hi,
First of all, I wish everybody a happy new year 2017.
I've set user@flink in CC so that users who are interested in helping with
the testing get notified. Please respond only to the dev@ list to keep the
discussion there!
According to the 1.2 release discussion thread, I've created a first
r
Hi,
Actually, it seems that "Fold cannot be used with a merging WindowAssigner", and
the workaround I found was to use a generic window function. It seems that I
would need to use window apply anyway. The functionality is then all there,
but I am really concerned about the resource utilisation. We have quite man
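For what it's worth, the apply-based workaround boils down to performing the fold eagerly over the buffered elements that the window function receives, instead of incrementally per element. A minimal Flink-independent sketch of that aggregation step (the sum over `Long` values is a hypothetical stand-in for the actual fold logic):

```java
import java.util.Arrays;
import java.util.List;

public class WindowApplySketch {

    // What Fold would have done incrementally, done eagerly here over the
    // buffered window contents that apply() hands over as an Iterable.
    static long sumWindow(Iterable<Long> elements) {
        long acc = 0L;          // fold's initial value
        for (long e : elements) {
            acc += e;           // fold step
        }
        return acc;
    }

    public static void main(String[] args) {
        List<Long> window = Arrays.asList(1L, 2L, 3L);
        System.out.println(sumWindow(window)); // 6
    }
}
```

In a real `WindowFunction`, this loop would live in `apply()`, with the result emitted via the `Collector`; the resource concern above is real, since the window must buffer all elements rather than keep a single accumulator.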
Happy new year everyone :)
I’m currently working on a paper about Flink. I already got some
recommendations on general papers with details about Flink, which have helped
me a lot. But now that I have read them, I’m further interested in the speedup
capabilities provided by the Flink framework: H
Indeed, that looks like what we need. We currently rely on Flink 1.0.1; that
feature might be a good reason to update our Flink version. I’ll test it. Many
thanks!
From: Jamie Grier [mailto:ja...@data-artisans.com]
Sent: Monday, 2 January 2017, 20:56
To: user@flink.apache.org
Subject: Re: Progra
Hey,
thanks for the reply. I didn’t change the Scala version, as it worked before. I
just changed the Flink version in my pom.xml; that’s it, a one-line change.
Maybe you could elaborate a bit more on what I can do to change the Scala version?
best Stephan
> On 03 Jan 2017, at 03:11, Kurt Young w