Hi,

I have a Kafka topic that contains dozens of different message types, and
I need to create a separate DStream for each type.

Currently I have to filter the Kafka stream over and over, which is very
inefficient.

So what's the best way to do this kind of dispatching in Spark Streaming
(one DStream -> multiple DStreams)?
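
For context, here is a sketch of the pattern I mean, using cache() so the
repeated filters at least scan cached blocks instead of re-reading and
re-decoding from Kafka each time. (Not tested; `Message` and `msgType` are
placeholders for my actual decoded record type, and the Kafka source is
elided.)

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Placeholder for the decoded record type; `msgType` is the dispatch key.
case class Message(msgType: String, payload: String)

object Dispatch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("dispatch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Stands in for the Kafka-backed DStream (KafkaUtils.createStream + decode).
    val rawStream: DStream[Message] = ???

    // Cache once so each per-type filter below reads the cached batch
    // rather than recomputing it from the Kafka receiver.
    rawStream.cache()

    val typeA = rawStream.filter(_.msgType == "A")
    val typeB = rawStream.filter(_.msgType == "B")
    // ... one filter per message type, dozens in my case

    typeA.print()
    typeB.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Each filter is still a full pass over every batch, which is what feels
inefficient with dozens of types.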


Thanks,
-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
