Hi Jeff,

Unfortunately this is not good enough for me.
My clients are very volatile: they start a batch and can go away at any moment 
without waiting for it to finish. Think of an elastic web application or an AWS 
Lambda.
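
For reference, this is roughly what I have now with the JobListener (a minimal
sketch; releaseDistributedLock is just a placeholder for my cleanup code):

```java
import org.apache.flink.api.common.JobExecutionResult;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.core.execution.JobClient;
import org.apache.flink.core.execution.JobListener;

import javax.annotation.Nullable;

public class BatchWithCleanup {

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // The listener runs in the client JVM, which is exactly my problem:
        // if the client goes away before the job finishes, onJobExecuted never fires.
        env.registerJobListener(new JobListener() {
            @Override
            public void onJobSubmitted(@Nullable JobClient jobClient,
                                       @Nullable Throwable throwable) {
                // e.g. record the job id for later bookkeeping
            }

            @Override
            public void onJobExecuted(@Nullable JobExecutionResult result,
                                      @Nullable Throwable throwable) {
                releaseDistributedLock();
            }
        });

        // ... build the batch pipeline with its several outputs here ...
        env.execute("batch-with-cleanup");
    }

    // Placeholder for the actual cleanup logic.
    private static void releaseDistributedLock() {
    }
}
```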

I hoped to find something that could be deployed to the cluster together with 
the batch code, maybe a hook into a job manager or similar. I do not plan to run 
anything heavy there, just some formal cleanups.
Is there something like this?

Thank you!

  Mark

‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
On Saturday, June 6, 2020 4:29 PM, Jeff Zhang <zjf...@gmail.com> wrote:

> It would run in the client side where ExecutionEnvironment is created.
>
> Mark Davis <moda...@protonmail.com> wrote on Saturday, June 6, 2020 at 8:14 PM:
>
>> Hi Jeff,
>>
>> Thank you very much! That is exactly what I need.
>>
>> Where will the listener code run in a cluster deployment (YARN, k8s)?
>> Will it be sent over the network?
>>
>> Thank you!
>>
>>   Mark
>>
>> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
>> On Friday, June 5, 2020 6:13 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>>
>>> You can try JobListener which you can register to ExecutionEnvironment.
>>>
>>> https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/core/execution/JobListener.java
>>>
>>> Mark Davis <moda...@protonmail.com> wrote on Saturday, June 6, 2020 at 12:00 AM:
>>>
>>>> Hi there,
>>>>
>>>> I am running a Batch job with several outputs.
>>>> Is there a way to run some code (e.g. release a distributed lock) after all 
>>>> outputs are finished?
>>>>
>>>> Currently I do this in a try-finally block around the 
>>>> ExecutionEnvironment.execute() call, but I have to switch to the detached 
>>>> execution mode - and in that mode the finally block is never run.
>>>>
>>>> Thank you!
>>>>
>>>>   Mark
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>
> --
> Best Regards
>
> Jeff Zhang