the response from Gordon.
>
> Piotrek
>
> On 2 Mar 2020, at 12:05, Kaymak, Tobias wrote:
>
> Hi,
>
> let me refine my question: My pipeline is generated from Beam, so the
> Flink pipeline is a translated Beam pipeline. When I update my Apache Beam
> pipeline code
> If you are recovering
> to some previous checkpoint, there is nothing Flink can do - some records
> were already committed before.
>
> Piotrek
>
> On 2 Mar 2020, at 10:12, Kaymak, Tobias wrote:
>
> Thank you Piotr!
>
> One last question - let's assume my source is
>>> ly - that
>>> was not stop with savepoint, as stop with savepoint is a relatively new
>>> feature.
>>> 3. Now that we have stop with savepoint (it can be used from CLI as you
>>> wrote), probably we could expose this feature in the new UI as well, unless
>> it’s already exposed somewhere? Yadong, do you know an answer for that?
>>
>> Piotrek
>>
>> On 27 Feb 2020, at 13:31, Kaymak, Tobias
>> wrote:
>>
Hello,
before Flink 1.9 I was able to "Stop" a streaming pipeline - after clicking
that button in the web interface it performed a clean shutdown. Now with
Flink 1.9 I just see the option to cancel it.
However, using the command line flink stop -d
266c5b38cf9d8e61a398a0bef4a1b350 still does the clean shutdown.
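For readers finding this thread later, the CLI route discussed above looks roughly like the following - a sketch assuming a reachable JobManager; the job ID and the savepoint target directory are placeholders, not values verified against this thread:

```shell
# Stop the job with a savepoint; -d (--drain) emits a MAX_WATERMARK
# before shutting the sources down, -p sets the savepoint directory.
flink stop -d \
    -p s3://my-bucket/savepoints \
    266c5b38cf9d8e61a398a0bef4a1b350
```

Without `-p`, Flink falls back to the configured default savepoint directory, if one is set.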
Moreover, adding
> dependency of flink-statebackend-rocksdb_2.11 in your pom.xml should be
> enough as it already includes the dependency of rocksdbjni.
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-stable/ops/config.html#classloader-resolve-order
>
> Best
> Yun Tang
>
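To make the advice above concrete, the Maven dependency would look roughly like this - a sketch; the version must match your Flink distribution (1.8.0 is taken from the message below):

```xml
<!-- Pulls in rocksdbjni transitively, as noted above -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-statebackend-rocksdb_2.11</artifactId>
    <version>1.8.0</version>
</dependency>
```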
Hi,
I am using Apache Beam 2.14.0 with Flink 1.8.0 and I have included the
RocksDB dependency in my project's pom.xml as well as baked it into the
Dockerfile like this:
FROM flink:1.8.0-scala_2.11
ADD --chown=flink:flink
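The Dockerfile in the message above is cut off in the archive; a minimal sketch of the pattern being described - note the JAR name and target path here are hypothetical, not recovered from the original message:

```dockerfile
FROM flink:1.8.0-scala_2.11
# Hypothetical job JAR and path; the actual ADD line was truncated
ADD --chown=flink:flink target/my-beam-pipeline.jar /opt/flink/lib/
```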
> application cluster (as
> documented here:
> https://ci.apache.org/projects/flink/flink-docs-release-1.8/ops/deployment/docker.html#flink-job-cluster
> )?
>
> – Ufuk
>
>
> On Tue, Aug 6, 2019 at 1:50 PM Kaymak, Tobias
> wrote:
I was using Apache Beam and in the lib folder I had a JAR that was using
Flink 1.7 in its POM. After bumping that to 1.8 it works :)
On Tue, Aug 6, 2019 at 11:58 AM Kaymak, Tobias
wrote:
It completely works when using the docker image tag 1.7.2 - I just bumped
back and the web interface was there.
On Tue, Aug 6, 2019 at 10:21 AM Kaymak, Tobias
wrote:
Hello,
after upgrading the docker image from version 1.7.2 to 1.8.1 and wiping
out zookeeper completely I see
{"errors":["Not found."]}
when trying to access the web interface of Flink. I can launch jobs from the
command line and I can't spot any error in the logs (so far on level INFO). I
tried