Hey guys,

I am trying to contain my excitement, but I am failing :)

So today I implemented a basic prototype for connecting batch processing
and streaming jobs, as a starting point for a 'lambda architecture
API'. Most surprisingly, it works :) It is very simple so far (it currently
only works for the map operator, and chaining is turned off), with some
hacking along the way, but I hope you will like it.

You can see the commit here:
https://github.com/mbalassi/incubator-flink/commit/7bfc16a7803e1f2cc4b38c746af069aab4450637

And a sample job:
https://github.com/mbalassi/incubator-flink/blob/7bfc16a7803e1f2cc4b38c746af069aab4450637/flink-addons/flink-streaming/flink-streaming-core/src/test/java/org/apache/flink/streaming/api/LambdaTest.java

For those who would rather not click through, it looks something like this:

ExecutionEnvironment batchEnv =
    ExecutionEnvironment.createLocalEnvironment(1);
StreamExecutionEnvironment streamEnv =
    StreamExecutionEnvironment.createLocalEnvironment(1);

DataSet<Long> dataSet =
    batchEnv.generateSequence(1, 10).map(new Increment());

DataStream<Long> dataStream = streamEnv.generateSequence(10, 20);

dataStream.lambdaJoin(dataSet).map(new Increment()).print();
dataSet.print();

LambdaEnvironment lambdaEnv = new LambdaEnvironment(batchEnv, streamEnv);

lambdaEnv.execute(":)");


I hope I could get you excited too :)

Cheers,

Gyula
