Have a look at dynamic resource allocation, described here:
https://spark.apache.org/docs/latest/job-scheduling.html
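For reference, dynamic allocation is turned on purely through configuration; a minimal sketch (the flag names are documented Spark configs, but the executor counts, class, and jar here are placeholders):

```shell
# Let each application grow and shrink its executor count with load.
# On YARN this requires the external shuffle service to be enabled.
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  --class com.example.MyApp \
  my-app.jar
```

With both applications submitted this way, idle executors are released back to YARN so the second application can acquire them.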
Thanks
Best Regards
On Thu, Oct 22, 2015 at 11:50 PM, Suman Somasundar <
suman.somasun...@oracle.com> wrote:
> Hi all,
>
>
>
> Is there a way to run 2 spark applications in parallel under Yarn in the
> same cluster?
You can run two threads in the driver, and Spark will FIFO-schedule the two
jobs on the same SparkContext you created (sharing its executors and cores).
The same idea is used in the Spark SQL Thrift Server flow.
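The driver-side pattern above can be sketched language-agnostically with plain Python threads; here `run_job` is a hypothetical stand-in for an action on the one shared SparkContext (e.g. `sc.parallelize(data).sum()`), and the FIFO ordering itself happens inside Spark's scheduler, not in this code:

```python
# Sketch: one driver process submitting two jobs from separate threads.
# run_job is a placeholder for a Spark action on a shared SparkContext;
# Spark's internal scheduler would FIFO-order the jobs the threads submit.
from concurrent.futures import ThreadPoolExecutor

def run_job(data):
    # placeholder for e.g. sc.parallelize(data).sum()
    return sum(data)

with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(run_job, range(100))
    f2 = pool.submit(run_job, range(50))
    print(f1.result(), f2.result())
```

Both "jobs" run concurrently from the caller's point of view while sharing one set of resources, which is the point of the single-context approach.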
For streaming, I think it lets you run only one stream at a time, even if you
run them on multiple threads.
If YARN has the capacity to run both simultaneously, it will. You should
ensure you are not allocating too many executors to the first application, so
that some capacity is left for the second.
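Concretely, that means capping the first application's footprint at submit time; a sketch (the resource numbers are illustrative, and the class and jar are placeholders):

```shell
# Cap the first app's resources so YARN keeps headroom for a second app.
spark-submit \
  --master yarn \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 2g \
  --class com.example.FirstApp \
  first-app.jar
```

Without an explicit cap, a single application can claim enough of the cluster that the second one sits in the ACCEPTED state until resources free up.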
You may want to run the applications on different YARN queues to control
resource allocation. If you run as a different
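Submitting to separate queues looks like this (the queue names are assumptions and must already be defined in your YARN scheduler configuration, e.g. capacity-scheduler.xml; classes and jars are placeholders):

```shell
# Each application targets its own YARN queue, so the scheduler
# enforces a capacity split between them.
spark-submit --master yarn --queue analytics \
  --class com.example.FirstApp first-app.jar
spark-submit --master yarn --queue batch \
  --class com.example.SecondApp second-app.jar
```

With capacities assigned to each queue, YARN can start both applications side by side instead of queuing one behind the other.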
Hi all,
Is there a way to run two Spark applications in parallel under YARN in the
same cluster?
Currently, if I submit two applications, one of them waits until the other one
is completed.
I want both of them to start and run at the same time.
Thanks,
Suman.