Thanks for the information.

Are there any plans to implement this? It is supported in other Docker
images...

On Tue, 8 Dec 2020 at 9:36 PM, Fabian Paul <fabianp...@data-artisans.com>
wrote:

> Hi Narasimha,
>
> I investigated your problem, and it is caused by multiple issues. First,
> vvp in general cannot really handle multi-job submissions per jar, because
> the complete deployment lifecycle in vvp is scoped around a single Flink
> job id. Therefore vvp sets a generated Flink job id during submission,
> which means that only a single job is executed [1].
> Secondly, in your example you are using `execute()` on the
> StreamEnvironment, which is a blocking operation. You need to use
> `executeAsync()` to execute the jobs simultaneously.
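>
> As a rough sketch (the pipelines and job names below are placeholders, and
> this only helps on deployments that actually accept multiple jobs per jar),
> `executeAsync()` returns a JobClient immediately instead of blocking until
> the job finishes:
>
>     StreamExecutionEnvironment env =
>         StreamExecutionEnvironment.getExecutionEnvironment();
>
>     // first pipeline: submitted, returns immediately
>     env.fromElements(1, 2, 3).map(i -> i * 2).print();
>     JobClient first = env.executeAsync("job-1");
>
>     // second pipeline: submitted while the first is still running
>     env.fromElements(4, 5, 6).map(i -> i * 2).print();
>     JobClient second = env.executeAsync("job-2");
>
> With the blocking `execute()`, the second submission would only happen
> after the first job terminates.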
>
> Best,
> Fabian
>
> [1]
> https://github.com/apache/flink/blob/c2972b6e336cc3b3a6cbd22c69a6710dab5246e6/flink-clients/src/main/java/org/apache/flink/client/deployment/application/ApplicationDispatcherBootstrap.java#L178
>
> On 6. Dec 2020, at 15:10, narasimha <swamy.haj...@gmail.com> wrote:
>
> thanks Fabian for responding.
>
> flink image : registry.ververica.com/v2.2/flink:1.11.1-stream1-scala_2.12
>
> There are no errors as such, but only the first job is being executed.
>
>
> On Thu, Dec 3, 2020 at 5:34 PM Fabian Paul <fabianp...@data-artisans.com>
> wrote:
>
>> Hi Narasimha,
>>
>> Nothing comes to mind immediately as to why it should not work. We are
>> using the StandaloneApplicationClusterEntryPoint to start the cluster. Can
>> you provide some more information about which Flink image you are trying
>> to use on vvp, and maybe share the error message?
>>
>> Best,
>> Fabian
>
>
>
> --
> A.Narasimha Swamy
>
Sent from my iPhone
