You can run 2 threads in the driver, and Spark will FIFO-schedule the 2 jobs on
the same SparkContext you created (sharing its executors and cores)...the same
idea is used in the Spark SQL Thrift Server flow...
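A minimal sketch of that pattern, for what it's worth (assumes a working Spark deployment; the object name and the RDD contents are made up for illustration):

```scala
// Sketch: two driver threads each submit a job against one shared SparkContext.
// By default the scheduler runs these jobs FIFO over the same executors/cores;
// setting spark.scheduler.mode=FAIR would switch to fair sharing between them.
import org.apache.spark.{SparkConf, SparkContext}

object MultiJobDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("MultiJobDemo"))

    val t1 = new Thread(() => {
      // Job 1: submitted from its own driver thread
      val n = sc.parallelize(1 to 1000000).map(_ * 2).count()
      println(s"job 1 counted $n elements")
    })
    val t2 = new Thread(() => {
      // Job 2: submitted concurrently from a second driver thread
      val s = sc.parallelize(1 to 1000000).map(_.toLong).sum()
      println(s"job 2 sum = $s")
    })

    t1.start(); t2.start()
    t1.join();  t2.join()
    sc.stop()
  }
}
```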

For streaming, I think it lets you run only one stream at a time even if you
run them on multiple threads in the driver...have to double-check...
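Simon's suggestion below (separate YARN queues plus preemption) would look roughly like this with the CapacityScheduler; the queue names and capacity split here are made-up examples:

```
<!-- capacity-scheduler.xml: define two queues splitting the cluster 50/50 -->
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>streaming,batch</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.streaming.capacity</name>
  <value>50</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.batch.capacity</name>
  <value>50</value>
</property>

<!-- yarn-site.xml: enable the preemption monitor so one app can't hog a queue -->
<property>
  <name>yarn.resourcemanager.scheduler.monitor.enable</name>
  <value>true</value>
</property>
```

Each application would then be submitted to its own queue, e.g. `spark-submit --queue streaming ...` and `spark-submit --queue batch ...`.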
On Oct 22, 2015 11:41 AM, "Simon Elliston Ball" <si...@simonellistonball.com>
wrote:

> If YARN has capacity to run both simultaneously, it will. You should ensure
> you are not allocating too many executors to the first app, leaving some
> room for the second.
>
> You may want to run the applications on different YARN queues to control
> resource allocation. If you run as a different user within the same queue,
> you should also get an even split between the applications; however, you may
> need to enable preemption to ensure the first doesn't simply hog the queue.
>
> Simon
>
> On 22 Oct 2015, at 19:20, Suman Somasundar <suman.somasun...@oracle.com>
> wrote:
>
> Hi all,
>
> Is there a way to run 2 spark applications in parallel under Yarn in the
> same cluster?
>
> Currently, if I submit 2 applications, one of them waits till the other
> one is completed.
>
> I want both of them to start and run at the same time.
>
> Thanks,
> Suman.