Hi,

In Storm we can do something like this:

TopologyBuilder builder = new TopologyBuilder();

builder.setSpout("spout", new NumberSpout());
builder.setBolt("mybolt", new Mybolt())
        .shuffleGrouping("spout")
        .shuffleGrouping("mybolt", "iterativeStream");
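
For reference, here is a rough sketch of how Mybolt might declare that extra
"iterativeStream" output next to its default stream. The field name, the
feedback condition, and the Storm 1.x package names (older releases use
backtype.storm instead of org.apache.storm) are just my assumptions:

import java.util.Map;

import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class Mybolt extends BaseRichBolt {
    private OutputCollector collector;

    @Override
    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    @Override
    public void execute(Tuple tuple) {
        long n = tuple.getLong(0);
        // result goes downstream on the default stream
        collector.emit(tuple, new Values(n));
        // anything that needs another pass is fed back on "iterativeStream"
        if (n > 0) {
            collector.emit("iterativeStream", tuple, new Values(n - 1));
        }
        collector.ack(tuple);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // default stream, consumed by downstream components
        declarer.declare(new Fields("number"));
        // extra stream that the topology wires back into this bolt
        declarer.declareStream("iterativeStream", new Fields("number"));
    }
}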

It means that one operation has two output streams, and one of them is fed 
back as an input to the same operation. It seems that Flink streaming also 
supports this feature: 
https://ci.apache.org/projects/flink/flink-docs-release-0.10/apis/streaming_guide.html#iterations
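
This is roughly what I understand the Flink iteration API from that page to
look like (I have not run this, and the class/method names may differ a bit
between Flink versions; the decrement logic is just a made-up example):

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class IterationSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Long> numbers = env.generateSequence(0, 100);

        // open the feedback loop on the input stream
        IterativeStream<Long> iteration = numbers.iterate();

        // the "operation": every element is decremented once per pass
        DataStream<Long> step = iteration.map(new MapFunction<Long, Long>() {
            @Override
            public Long map(Long value) {
                return value - 1;
            }
        });

        // elements that are still positive are fed back into the loop
        DataStream<Long> feedback = step.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value > 0;
            }
        });
        iteration.closeWith(feedback);

        // elements that reached zero leave the loop on the forward stream
        step.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value <= 0;
            }
        }).print();

        env.execute("Streaming iteration sketch");
    }
}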

My question is: does Spark Streaming also support this feature? If yes, how? 
I couldn't find anything about it on the Internet.

Thanks for your help.
Jun
