Yes, in the sense that you can create, and trigger an action on, as many RDDs derived from the batch's RDD as you like.
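For example, a minimal sketch (the socket source, host/port, and stream names are hypothetical placeholders, not from the original thread): each batch's data can fan out into several derived DStreams, each with its own output action.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MultiOutputSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MultiOutputSketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(1))

    // Hypothetical input source for illustration.
    val lines = ssc.socketTextStream("localhost", 9999)

    // Two independent streams derived from the same source DStream...
    val words   = lines.flatMap(_.split(" "))
    val lengths = lines.map(_.length)

    // ...each with its own output action.
    words.print()
    lengths.foreachRDD { rdd =>
      // Any RDD action works here; count() is just an example.
      println(s"lengths this batch: ${rdd.count()}")
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that this gives fan-out (multiple outputs from one stream); to my understanding the DStream graph itself is acyclic, so a true feedback loop into the same operator, as in Storm's `shuffleGrouping` back-edge or Flink's iterations, is not expressed directly this way.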
On Thu, Dec 3, 2015 at 8:04 PM, Wang Yangjun <yangjun.w...@aalto.fi> wrote:
> Hi,
>
> In Storm we can do things like:
>
> TopologyBuilder builder = new TopologyBuilder();
>
> builder.setSpout("spout", new NumberSpout());
> builder.setBolt("mybolt", new Mybolt())
>     .shuffleGrouping("spout")
>     .shuffleGrouping("mybolt", "iterativeStream");
>
> This means that after one operation there are two output streams, one of
> which is fed back as an input to the same operation. Flink streaming also
> seems to support this feature:
> https://ci.apache.org/projects/flink/flink-docs-release-0.10/apis/streaming_guide.html#iterations
>
> My question is: does Spark Streaming also support this feature? If yes, how?
> I couldn't find anything about it on the Internet.
>
> Thanks for your help.
> Jun