Oh god, if we have some code using an Accumulator after env.execute(), will
that not be executed on the JobManager either?
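For context, here is a minimal sketch of the pattern I am worried about,
using the standard IntCounter accumulator API (the job and accumulator
names are just for illustration):

import org.apache.flink.api.common.JobExecutionResult;
import org.apache.flink.api.common.accumulators.IntCounter;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.io.DiscardingOutputFormat;
import org.apache.flink.configuration.Configuration;

public class AccumulatorAfterExecute {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("a", "b", "c")
           .map(new RichMapFunction<String, String>() {
               private final IntCounter counter = new IntCounter();

               @Override
               public void open(Configuration parameters) {
                   getRuntimeContext().addAccumulator("lines", counter);
               }

               @Override
               public String map(String value) {
                   counter.add(1);
                   return value;
               }
           })
           .output(new DiscardingOutputFormat<>());

        // Everything below runs in main() after the job finishes --
        // this is the part we are unsure actually runs when the jar
        // is submitted to the cluster.
        JobExecutionResult result = env.execute("accumulator-test");
        Integer lines = result.getAccumulatorResult("lines");
        System.out.println("lines processed: " + lines);
    }
}
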
Thanks, I would be interested indeed!

------------------

Bastien DINE
Data Architect / Software Engineer / Sysadmin
bastiendine.io


On Fri, Nov 23, 2018 at 4:37 PM, Flavio Pompermaier <pomperma...@okkam.it>
wrote:

> The problem is that the REST API blocks on env.execute().
> If you want to run your Flink job you have to submit it using the CLI
> client.
> As a workaround we wrote a Spring REST API that, to run a job, opens an
> SSH connection to the JobManager and executes the bin/flink run command.
>
> If you're interested I can share some code.
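>
> To give an idea, here is a minimal sketch of such an SSH-based
> submission, assuming the JSch library (the host, user, password, and
> paths below are illustrative placeholders, not our actual code):
>
> import com.jcraft.jsch.ChannelExec;
> import com.jcraft.jsch.JSch;
> import com.jcraft.jsch.Session;
>
> public class RemoteFlinkSubmitter {
>     public static void submit(String jarPath) throws Exception {
>         JSch jsch = new JSch();
>         // Placeholder host/credentials -- adapt to your cluster.
>         Session session = jsch.getSession("flink", "jobmanager-host", 22);
>         session.setPassword("secret");
>         session.setConfig("StrictHostKeyChecking", "no");
>         session.connect();
>
>         // Execute the Flink CLI on the JobManager machine.
>         ChannelExec channel = (ChannelExec) session.openChannel("exec");
>         channel.setCommand("/opt/flink/bin/flink run " + jarPath);
>         channel.connect();
>
>         // Wait for the remote command to finish, then clean up.
>         while (!channel.isClosed()) {
>             Thread.sleep(500);
>         }
>         channel.disconnect();
>         session.disconnect();
>     }
> }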
>
>
>
> On Fri, Nov 23, 2018 at 4:32 PM bastien dine <bastien.d...@gmail.com>
> wrote:
>
>> Hello,
>>
>> I need to chain processing in the DataSet API, so I am launching several
>> jobs, with multiple env.execute() calls:
>>
>> topology1.define();
>> env.execute();
>>
>> topology2.define();
>> env.execute();
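>>
>> For reference, a self-contained version of this pattern could look like
>> the following (the two topologies here are just placeholders for
>> illustration):
>>
>> import org.apache.flink.api.java.ExecutionEnvironment;
>> import org.apache.flink.api.java.io.DiscardingOutputFormat;
>>
>> public class ChainedJobs {
>>     public static void main(String[] args) throws Exception {
>>         ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
>>
>>         // First topology: submitted as its own job.
>>         env.fromElements(1, 2, 3)
>>            .map(i -> i * 2)
>>            .output(new DiscardingOutputFormat<>());
>>         env.execute("topology-1");
>>
>>         // Second topology: a second, separate job on the same environment.
>>         env.fromElements("a", "b")
>>            .map(String::toUpperCase)
>>            .output(new DiscardingOutputFormat<>());
>>         env.execute("topology-2");
>>     }
>> }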
>>
>> This works fine when I run it within IntelliJ,
>> but when I deploy it to my cluster, it only launches the first
>> topology.
>>
>> Could you please shed some light on this issue?
>>
>> Regards,
>> Bastien
>>
>
>
>
