You can easily have multiple Flink programs in a single JAR file.
A program is defined using an ExecutionEnvironment and executed when you
call ExecutionEnvironment.execute().
Where and how you do that does not matter.
You can, for example, implement a main method such as:
public static void main(String... args) {
    if (today == Monday) {
        ExecutionEnvironment env = ...
        // define Monday program
        env.execute();
    } else {
        ExecutionEnvironment env = ...
        // define other program
        env.execute();
    }
}
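If you want something closer to Hadoop's ProgramDriver, you can register jobs by name and dispatch on a command-line argument. Below is a minimal sketch of that pattern in plain Java; the class name JobDriver and the job names are illustrative, not a Flink API, and each Runnable stands in for code that would create its own ExecutionEnvironment and call env.execute():

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative ProgramDriver-style dispatcher (not part of Flink):
// register jobs under a name, then pick one at runtime.
public class JobDriver {
    private final Map<String, Runnable> jobs = new LinkedHashMap<>();

    public void addJob(String name, Runnable job) {
        jobs.put(name, job);
    }

    public void run(String name) {
        Runnable job = jobs.get(name);
        if (job == null) {
            throw new IllegalArgumentException(
                "Unknown job '" + name + "'; available: " + jobs.keySet());
        }
        job.run();
    }

    public static void main(String[] args) {
        JobDriver driver = new JobDriver();
        // In a real setup each Runnable would build an
        // ExecutionEnvironment, define the program, and execute it.
        driver.addJob("wordcount", () -> System.out.println("running wordcount"));
        driver.addJob("pagerank", () -> System.out.println("running pagerank"));
        driver.run(args.length > 0 ? args[0] : "wordcount");
    }
}
```

Then `java -jar yourJob.jar pagerank` would run only the job registered under that name.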
2015-05-08 11:41 GMT+02:00 Flavio Pompermaier <[email protected]>:
> Hi to all,
> is there any way to keep multiple jobs in a jar and then choose at runtime
> the one to execute (like what ProgramDriver does in Hadoop)?
>
> Best,
> Flavio
>
>