Re: Running 2 spark application in parallel

2015-11-01 Thread Akhil Das
Have a look at dynamic resource allocation, described here:
https://spark.apache.org/docs/latest/job-scheduling.html
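
For context, dynamic allocation is switched on through a few configuration properties. A minimal sketch (values are illustrative; under YARN the external shuffle service must also be running on each NodeManager for this to work):

```
# spark-defaults.conf (illustrative values)
spark.dynamicAllocation.enabled        true
spark.shuffle.service.enabled          true
spark.dynamicAllocation.minExecutors   1
# Capping maxExecutors leaves room on the cluster for a second application.
spark.dynamicAllocation.maxExecutors   4
```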

Thanks
Best Regards

On Thu, Oct 22, 2015 at 11:50 PM, Suman Somasundar <
suman.somasun...@oracle.com> wrote:

> Hi all,
>
> Is there a way to run 2 spark applications in parallel under Yarn in the
> same cluster?
>
> Currently, if I submit 2 applications, one of them waits till the other
> one is completed.
>
> I want both of them to start and run at the same time.
>
> Thanks,
> Suman.
>


Re: Running 2 spark application in parallel

2015-10-23 Thread Debasish Das
You can run 2 threads in the driver, and Spark will FIFO-schedule the 2 jobs
on the same SparkContext you created (sharing its executors and cores)...the
same idea is used in the Spark SQL Thrift Server flow...

For streaming, I think it lets you run only one stream at a time even if you
run them on multiple threads in the driver...have to double-check...
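
The pattern Debasish describes can be sketched as follows, as a plain-Python illustration: `job_a` and `job_b` are hypothetical placeholders; in a real driver each would instead invoke an action on the one shared SparkContext (e.g. `sc.parallelize(data).count()`).

```python
import threading

# Hypothetical stand-ins for Spark actions; in a real driver each would be
# an action on the single shared SparkContext.
def job_a():
    return sum(range(1, 1001))

def job_b():
    return sum(x * 2 for x in range(1, 1001))

results = {}

def run(name, job):
    # Each driver thread submits its own job; Spark's scheduler (FIFO by
    # default, FAIR if configured) decides how the jobs share the executors.
    results[name] = job()

threads = [threading.Thread(target=run, args=("a", job_a)),
           threading.Thread(target=run, args=("b", job_b))]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results["a"], results["b"])  # prints: 500500 1001000
```

With FIFO scheduling the second job only gets resources the first leaves free; configuring the FAIR scheduler gives the concurrent jobs a roughly even share.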


Re: Running 2 spark application in parallel

2015-10-22 Thread Simon Elliston Ball
If yarn has capacity to run both simultaneously, it will. You should ensure
you are not allocating too many executors for the first app, leaving some
room for the second.

You may want to run the applications on different yarn queues to control
resource allocation. If you run as a different user within the same queue you
should also get an even split between the applications; however, you may need
to enable preemption to ensure the first doesn't just hog the queue.
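
A sketch of the queue approach (queue, class, and jar names here are illustrative; the queues themselves must already be defined in the YARN scheduler configuration):

```
# Submit each application to its own YARN queue via spark-submit's --queue flag
spark-submit --master yarn --queue queueA --class com.example.AppOne app-one.jar
spark-submit --master yarn --queue queueB --class com.example.AppTwo app-two.jar
```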

Simon 

> On 22 Oct 2015, at 19:20, Suman Somasundar wrote:
>
> Hi all,
>
> Is there a way to run 2 spark applications in parallel under Yarn in the same
> cluster?
>
> Currently, if I submit 2 applications, one of them waits till the other one
> is completed.
>
> I want both of them to start and run at the same time.
>
> Thanks,
> Suman.


Running 2 spark application in parallel

2015-10-22 Thread Suman Somasundar
Hi all,

Is there a way to run 2 spark applications in parallel under Yarn in the same
cluster?

Currently, if I submit 2 applications, one of them waits till the other one is
completed.

I want both of them to start and run at the same time.

Thanks,
Suman.