Hi Luke,

You can enable "execution.attached" [1]; env.execute() will then block until
the job has finished.

[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/config/#execution-attached
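
For example, something like this (a minimal sketch, not the only way to set
it: the option can also go in flink-conf.yaml as "execution.attached: true",
and whether the programmatic route takes effect depends on how the job is
deployed; jobName and doSomeCleanupTasks() are from your snippet):

import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.DeploymentOptions;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

Configuration conf = new Configuration();
conf.set(DeploymentOptions.ATTACHED, true);   // key: execution.attached

StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment(conf);

// build the pipeline ...
env.execute(jobName);      // blocks until the job reaches a terminal state
doSomeCleanupTasks();      // runs only after the pipeline has finished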

Best,
Weihua


On Fri, May 12, 2023 at 8:59 AM Shammon FY <zjur...@gmail.com> wrote:

> Hi Luke,
>
> Maybe you can get a 'JobClient' after submitting the job and check the job
> status with 'JobClient.getJobStatus()'.
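>
> For example, something like this (a minimal sketch of the idea; the polling
> interval is arbitrary, error handling is omitted, and jobName and
> doSomeCleanupTasks() are from your snippet):
>
> import org.apache.flink.api.common.JobStatus;
> import org.apache.flink.core.execution.JobClient;
>
> // submit without blocking, then poll until the job is globally terminal
> JobClient client = env.executeAsync(jobName);
> JobStatus status = client.getJobStatus().get();
> while (!status.isGloballyTerminalState()) {
>     Thread.sleep(1000L);
>     status = client.getJobStatus().get();
> }
> doSomeCleanupTasks();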
>
> Best,
> Shammon FY
>
>
> On Fri, May 12, 2023 at 2:58 AM Luke Xiong <leix...@gmail.com> wrote:
>
>> Hi,
>>
>> My flink job needs to do something when the pipeline execution has ended.
>> The job code is like this:
>>
>> createSomeStream().applySomeOperators();
>> env.execute(jobName);
>> doSomeCleanupTasks();
>>
>> It looks like doSomeCleanupTasks() can be called while the pipeline is
>> still running. The job is for processing a bounded stream, so it doesn't
>> run forever. Is it possible to ensure that doSomeCleanupTasks() is called
>> only after the pipeline has processed all the data? This happens when the
>> runtime mode is STREAMING. Would running it in BATCH mode make any
>> difference?
>>
>> Regards,
>> Luke
>>
>>
>>
