Re: Concurrency does not improve for Spark Jobs with Same Spark Context

2016-02-18 Thread Prabhu Joseph
Fair Scheduler; the YARN queue has the entire cluster resource as maxResources, so preemption does not come into the picture during the test case and all the Spark jobs got the resources they requested. The concurrent jobs run fine with different Spark Contexts, so resource contention is not the right suspect. For reference, a Fair Scheduler allocation file along these lines gives a single queue the whole cluster as maxResources; the queue name and resource figures below are illustrative, not taken from the thread.
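  <?xml version="1.0"?>
  <!-- fair-scheduler.xml: hypothetical allocation matching the setup described above -->
  <allocations>
    <queue name="spark">
      <!-- maxResources set to the full cluster capacity, so jobs in this queue are never capped below it -->
      <maxResources>100000 mb, 48 vcores</maxResources>
      <weight>1.0</weight>
      <schedulingPolicy>fair</schedulingPolicy>
    </queue>
    <!-- with one queue owning all resources, cross-queue preemption never triggers -->
  </allocations>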

Re: Concurrency does not improve for Spark Jobs with Same Spark Context

2016-02-18 Thread Ted Yu
Is it possible to perform the tests using Spark 1.6.0? Thanks. On Thu, Feb 18, 2016 at 9:51 PM, Prabhu Joseph wrote: > Hi All, > When running concurrent Spark Jobs on YARN (Spark-1.5.2) which share a > single Spark Context, the jobs take more time to complete

Re: Concurrency does not improve for Spark Jobs with Same Spark Context

2016-02-18 Thread Jörn Franke
How did you configure the YARN queues? Which scheduler? Preemption? > On 19 Feb 2016, at 06:51, Prabhu Joseph wrote: > Hi All, > When running concurrent Spark Jobs on YARN (Spark-1.5.2) which share a > single Spark Context, the jobs take more time to complete
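For context, the scheduler and preemption settings being asked about live in yarn-site.xml; a typical Fair Scheduler configuration looks roughly like the following (the allocation file path and preemption value are placeholders, not confirmed from the thread).

  <!-- yarn-site.xml: select the Fair Scheduler and point it at an allocation file -->
  <property>
    <name>yarn.resourcemanager.scheduler.class</name>
    <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
  </property>
  <property>
    <!-- whether containers may be preempted to rebalance queues -->
    <name>yarn.scheduler.fair.preemption</name>
    <value>false</value>
  </property>
  <property>
    <name>yarn.scheduler.fair.allocation.file</name>
    <value>/etc/hadoop/conf/fair-scheduler.xml</value>
  </property>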

Concurrency does not improve for Spark Jobs with Same Spark Context

2016-02-18 Thread Prabhu Joseph
Hi All, When running concurrent Spark Jobs on YARN (Spark-1.5.2) which share a single Spark Context, the jobs take more time to complete compared with when they run with different Spark Contexts. The Spark jobs are submitted on different threads (a rough sketch of this setup follows below). Test Case: A. 3 spark jobs submitted
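The original post is truncated in the archive; as a minimal sketch of the setup it describes (several jobs submitted from separate threads against one shared SparkContext, with FAIR scheduling inside the application), something along these lines could reproduce the scenario. The object name, job bodies, and pool size are illustrative, not taken from the thread.

  import java.util.concurrent.Executors

  import scala.concurrent.{Await, ExecutionContext, Future}
  import scala.concurrent.duration.Duration

  import org.apache.spark.{SparkConf, SparkContext}

  object ConcurrentJobsSketch {
    def main(args: Array[String]): Unit = {
      // One SparkContext shared by all jobs; FAIR mode lets the jobs
      // share executors instead of queuing strictly FIFO within the app.
      val conf = new SparkConf()
        .setAppName("concurrent-jobs-sketch")
        .set("spark.scheduler.mode", "FAIR")
      val sc = new SparkContext(conf)

      // Thread pool from which the jobs are submitted concurrently.
      implicit val ec: ExecutionContext =
        ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(3))

      // Three independent jobs, each a trivial stand-in for the real workload.
      val jobs = (1 to 3).map { i =>
        Future {
          sc.parallelize(1 to 1000000, numSlices = 8).map(_ * i).sum()
        }
      }

      jobs.foreach(f => Await.result(f, Duration.Inf))
      sc.stop()
    }
  }

With a single context, all three jobs compete for the same set of executors under Spark's internal scheduler, whereas separate contexts each get their own YARN application and executors, which is the behavioural difference the thread is probing.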