Hi guys.

Is there a more lightweight way of doing stream processing with Spark? What we
want is something simpler, preferably with no scheduling overhead, that just
streams the data to multiple destinations.

We use Spark Core, SQL, Streaming, and GraphX extensively, so Spark is our main
tool and we don't want to add new components like Storm or Flume to the stack.
On the other hand, it takes considerably more resources for the same streaming
workload than our previous Flume setup, especially when we have multiple
destinations (each destination triggers its own action and scheduling round).
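For context, a minimal sketch of the pattern I mean (the sink helpers
`saveToHBase`/`saveToES` are hypothetical placeholders, not our real code).
Each output operation on a DStream is a separate Spark action, so without
caching, the batch is recomputed and scheduled once per destination:

```scala
stream.foreachRDD { rdd =>
  rdd.cache()          // avoid recomputing the batch for every sink
  saveToHBase(rdd)     // action 1: first destination
  saveToES(rdd)        // action 2: second destination
  rdd.unpersist()      // free the cached batch once all sinks are written
}
```

Even with `cache()`, each action still goes through the driver's job
scheduling, which is the overhead we'd like to avoid.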


-- 
RGRDZ Harut
