Hi Fabian, thanks for the response. So my mains should be converted into methods returning the ExecutionEnvironment. However, I think it would be very nice to have a syntax like that of the Hadoop ProgramDriver to define the jobs to invoke from a single root class. Do you think it could be useful?
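For illustration, a minimal ProgramDriver-style dispatcher could look something like the sketch below. The class name JobDriver and the registered job names are hypothetical; I use plain Runnables so the sketch compiles without Flink on the classpath, but in a real fat JAR each Runnable would build its own ExecutionEnvironment and call execute():

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical ProgramDriver-like registry: maps a job name given on the
// command line to the code that defines and executes that Flink program.
public class JobDriver {
    private final Map<String, Runnable> jobs = new LinkedHashMap<>();

    // Register a named job; returns this so calls can be chained.
    public JobDriver register(String name, Runnable job) {
        jobs.put(name, job);
        return this;
    }

    // Run the job named by args[0]; print the known names otherwise.
    public int run(String[] args) {
        if (args.length == 0 || !jobs.containsKey(args[0])) {
            System.err.println("Usage: <jobName>, one of " + jobs.keySet());
            return 1;
        }
        jobs.get(args[0]).run();
        return 0;
    }

    public static void main(String[] args) {
        JobDriver driver = new JobDriver()
            // In a real JAR these would call e.g. MondayJob.main(args),
            // which defines an ExecutionEnvironment and calls execute().
            .register("monday", () -> System.out.println("running monday job"))
            .register("other",  () -> System.out.println("running other job"));
        System.exit(driver.run(args));
    }
}
```

This way the single root class named in the JAR manifest can stay tiny, and adding a new job is just one more register() call.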
On Fri, May 8, 2015 at 12:42 PM, Fabian Hueske <[email protected]> wrote:
> You can easily have multiple Flink programs in a single JAR file.
> A program is defined using an ExecutionEnvironment and executed when you
> call ExecutionEnvironment.execute().
> Where and how you do that does not matter.
>
> You can for example implement a main function such as:
>
> public static void main(String... args) {
>   if (today == Monday) {
>     ExecutionEnvironment env = ...
>     // define Monday prog
>     env.execute();
>   } else {
>     ExecutionEnvironment env = ...
>     // define other prog
>     env.execute();
>   }
> }
>
> 2015-05-08 11:41 GMT+02:00 Flavio Pompermaier <[email protected]>:
>
>> Hi to all,
>> is there any way to keep multiple jobs in a jar and then choose at
>> runtime the one to execute (like what ProgramDriver does in Hadoop)?
>>
>> Best,
>> Flavio
