Hi,

I am trying to build functionality to dynamically configure a Flink job
(Java) in my code based on some additional metadata and submit it to a
Flink session cluster.

The Flink version is 1.11.2.

The problem is how to provide the packaged job to the cluster. When I try
the following code:

StreamExecutionEnvironment env =
    StreamExecutionEnvironment.createRemoteEnvironment(hostName, hostPort);
// ... configuring job workflow here ...
env.execute(jobName);

I get a ClassNotFoundException indicating that the code for my mapping
functions did not make it to the cluster, which makes sense: nothing in
this setup ships my user code classes to the remote JobManager.
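
For reference, I know createRemoteEnvironment has an overload that also
takes paths to jar files which get shipped along with the job (the jar
path below is just a placeholder):

// Variant that uploads a user-code jar together with the job graph.
StreamExecutionEnvironment env =
    StreamExecutionEnvironment.createRemoteEnvironment(
        hostName,
        hostPort,
        "/path/to/user-code.jar"); // hypothetical jar location

But since my job graph is assembled at runtime, there is no pre-built jar
containing the mapping functions that I could point at.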

What would be the right way to deploy dynamically configured Flink jobs
that are not packaged as a jar file but generated ad hoc?

Thanks
