Hi Weihua and Shammon,

Thanks for the pointers. I tried both; unfortunately, neither works.

Enabling "execution.attached" doesn't seem to make any difference from the
default settings: doSomeCleanupTasks() is still called right away while the
pipeline is running. And env.executeAsync().getJobStatus() throws an
exception:
    org.apache.flink.util.FlinkRuntimeException: The Job Status cannot be
requested when in Web Submission.

FYI, I am using Flink 1.15, and the job is submitted via the REST endpoint
*/jars/:jarid/run*.
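
For reference, the status check I tried looks roughly like this (a minimal
sketch; createSomeStream()/applySomeOperators() are placeholders from my
job code):

    import org.apache.flink.api.common.JobStatus;
    import org.apache.flink.core.execution.JobClient;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();
    createSomeStream().applySomeOperators();

    // Submit without blocking; the returned JobClient handles the running job.
    JobClient client = env.executeAsync(jobName);
    // This is the call that throws the FlinkRuntimeException above when the
    // jar was submitted through the web/REST endpoint:
    JobStatus status = client.getJobStatus().get();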

Regards,
Luke

On Fri, May 12, 2023 at 1:32 AM Weihua Hu <huweihua....@gmail.com> wrote:

>
> Hi, Luke
>
> You can enable "execution.attached", then env.execute() will wait until
> the job is finished.
>
> [1]
> https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/config/#execution-attached
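>
> A minimal sketch of setting it programmatically (the same key can go in
> flink-conf.yaml as "execution.attached: true"):
>
>     import org.apache.flink.configuration.Configuration;
>     import org.apache.flink.configuration.DeploymentOptions;
>     import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
>
>     Configuration conf = new Configuration();
>     conf.set(DeploymentOptions.ATTACHED, true);
>     StreamExecutionEnvironment env =
>         StreamExecutionEnvironment.getExecutionEnvironment(conf);
>     // env.execute() should then block until the job reaches a terminal state.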
>
> Best,
> Weihua
>
>
> On Fri, May 12, 2023 at 8:59 AM Shammon FY <zjur...@gmail.com> wrote:
>
>> Hi Luke,
>>
>> Maybe you can get a 'JobClient' after submitting the job and check the
>> job status with 'JobClient.getJobStatus()'.
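>>
>> A rough sketch of that approach (assuming the job is submitted with
>> executeAsync; the poll interval is arbitrary):
>>
>>     import org.apache.flink.api.common.JobStatus;
>>     import org.apache.flink.core.execution.JobClient;
>>
>>     JobClient client = env.executeAsync(jobName);
>>     // Poll until the job reaches a terminal state
>>     // (FINISHED, FAILED, or CANCELED).
>>     while (!client.getJobStatus().get().isGloballyTerminalState()) {
>>         Thread.sleep(1_000L);
>>     }
>>     doSomeCleanupTasks();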
>>
>> Best,
>> Shammon FY
>>
>>
>> On Fri, May 12, 2023 at 2:58 AM Luke Xiong <leix...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> My Flink job needs to do something after the pipeline execution has
>>> ended. The job code looks like this:
>>>
>>> createSomeStream().applySomeOperators();
>>> env.execute(jobName);
>>> doSomeCleanupTasks();
>>>
>>> It looks like doSomeCleanupTasks() can be called while the pipeline is
>>> still running. The job processes a bounded stream, so it doesn't run
>>> forever. Is there a way to ensure doSomeCleanupTasks() is called only
>>> after the pipeline has processed all the data? This happens with the
>>> runtime mode set to STREAMING; would running in BATCH mode make any
>>> difference?
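>>>
>>> (For reference, my understanding is that switching the mode would be
>>> roughly:
>>>
>>>     env.setRuntimeMode(RuntimeExecutionMode.BATCH);
>>>
>>> with RuntimeExecutionMode from org.apache.flink.api.common.)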
>>>
>>> Regards,
>>> Luke
>>>
>>>
>>>
