Hi,
by now I understand a bit better how spark-submit and YARN play
together, and how the Spark driver and slaves interact on YARN.
Now for my use case, as described at
https://spark.apache.org/docs/latest/submitting-applications.html, I would
probably have an end-user-facing gateway that
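As a rough illustration of what such a gateway could do, here is a hedged sketch of assembling a spark-submit invocation for a YARN cluster. The jar path, main class, and resource settings are placeholders I made up for the example, not values from this thread, and the flag set is just the minimal one from the submitting-applications page.

```python
# Hypothetical sketch: a gateway service assembling the spark-submit
# argv for YARN cluster mode. Jar path, class name, and executor count
# are assumed placeholders.
def build_submit_command(app_jar, main_class, app_args, executors=2):
    """Return the spark-submit argv for launching an app on YARN."""
    return [
        "spark-submit",
        "--master", "yarn",
        "--deploy-mode", "cluster",   # driver runs inside the YARN cluster
        "--class", main_class,
        "--num-executors", str(executors),
        app_jar,
    ] + list(app_args)

cmd = build_submit_command(
    "pipelines.jar", "example.StreamingPipeline", ["--user", "alice"])
print(" ".join(cmd))
```

(On Spark 1.x, which this thread predates 2.x, the equivalent master string was `yarn-cluster`.) The gateway would then hand `cmd` to whatever process-launching mechanism it uses.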
Hi,
On Thu, Sep 4, 2014 at 10:33 AM, Tathagata Das tathagata.das1...@gmail.com
wrote:
In the current state of Spark Streaming, creating separate Java processes,
each having a streaming context, is probably the best approach to
dynamically adding and removing input sources. All of these
Hi,
I am not sure if multi-tenancy is the right word, but I am thinking about
a Spark application where multiple users can, say, log into some web
interface and specify a data processing pipeline with a streaming source,
processing steps, and an output.
Now as far as I know, there can be only one
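To make the idea concrete, here is a hedged sketch of the per-user pipeline specification such a web interface might collect. The field names and the URI-style source/sink strings are my assumptions for illustration, not anything defined in this thread.

```python
# Hypothetical per-user pipeline spec; field names are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PipelineSpec:
    user: str
    source: str                                      # e.g. a socket or Kafka address
    steps: List[str] = field(default_factory=list)   # ordered processing stages
    sink: str = "console"                            # where results are written

spec = PipelineSpec(user="alice",
                    source="socket://host:9999",
                    steps=["filter", "count"],
                    sink="hdfs:///out/alice")
```

Each submitted spec would then be translated into one streaming job.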
In the current state of Spark Streaming, creating separate Java processes,
each having a streaming context, is probably the best approach to
dynamically adding and removing input sources. All of these should be
able to use a YARN cluster for resource allocation.
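The one-process-per-streaming-context approach above could be managed along these lines; this is a minimal sketch under the assumption that each pipeline is launched as its own OS process. In a real deployment the argv would be a spark-submit invocation against YARN; here it is an arbitrary command list.

```python
# Hypothetical manager: one OS process per streaming context, so input
# sources are "added" by starting a process and "removed" by stopping it.
import subprocess

class PipelineManager:
    def __init__(self):
        self._procs = {}  # pipeline name -> subprocess.Popen

    def start(self, name, argv):
        """Launch a separate process for one streaming pipeline."""
        if name in self._procs:
            raise ValueError("pipeline %r already running" % name)
        self._procs[name] = subprocess.Popen(argv)

    def stop(self, name):
        """Terminate the named pipeline's process and forget it."""
        proc = self._procs.pop(name)
        proc.terminate()
        proc.wait()

    def running(self):
        """Names of pipelines whose processes are still alive."""
        return [n for n, p in self._procs.items() if p.poll() is None]
```

Stopping a pipeline here simply terminates its driver process, which is what makes the add/remove operations independent of one another.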
On Wed, Sep 3, 2014 at 6:30