Re: Multiple vcores per container when running Spark applications in Yarn cluster mode

2017-09-11 Thread Gourav Sengupta
Saisai, thanks a ton :)

Regards,
Gourav

On Mon, Sep 11, 2017 at 11:36 PM, Xiaoye Sun wrote:
> Hi Jerry,
>
> This solves my problem. Thanks!

Re: Multiple vcores per container when running Spark applications in Yarn cluster mode

2017-09-11 Thread Xiaoye Sun
Hi Jerry,

This solves my problem. Thanks!

On Sun, Sep 10, 2017 at 8:19 PM, Saisai Shao wrote:
> I guess you're using Capacity Scheduler with DefaultResourceCalculator,
> which doesn't count CPU cores in its resource calculation, so the "1" you
> saw is actually meaningless.

Re: Multiple vcores per container when running Spark applications in Yarn cluster mode

2017-09-10 Thread Saisai Shao
I guess you're using the Capacity Scheduler with DefaultResourceCalculator, which doesn't count CPU cores in its resource calculation, so the "1" you saw is actually meaningless. If you want CPU to be included in the resource calculation as well, you should choose DominantResourceCalculator.

Thanks,
Jerry
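For anyone hitting this later: the change Jerry describes goes in capacity-scheduler.xml on the ResourceManager. A minimal sketch of the property (restart or refresh the ResourceManager after changing it):

    <!-- capacity-scheduler.xml: switch the CapacityScheduler off the
         memory-only DefaultResourceCalculator so vcores are counted too -->
    <property>
      <name>yarn.scheduler.capacity.resource-calculator</name>
      <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
    </property>

With this in place, an executor requesting spark.executor.cores=2 should show up as 2 vcores per container in the ResourceManager UI.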

Multiple vcores per container when running Spark applications in Yarn cluster mode

2017-09-08 Thread Xiaoye Sun
Hi,

I am using Spark 1.6.1 and YARN 2.7.4, and I want to submit a Spark application to a YARN cluster. However, I found that the number of vcores assigned to a container/executor is always 1, even if I set spark.executor.cores=2. I also found that the number of tasks an executor runs concurrently is 2.
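For context, a submission along these lines exhibits the behavior described above (the application class, jar, executor count, and memory are placeholders, not from the original report):

    # Hypothetical repro: request 2 cores per executor in yarn-cluster mode.
    # Under DefaultResourceCalculator the RM UI reports 1 vcore per container,
    # yet each executor still runs 2 tasks concurrently, because Spark sizes
    # its task slots from spark.executor.cores, not from YARN's reported vcores.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 4 \
      --executor-memory 4g \
      --conf spark.executor.cores=2 \
      --class com.example.MyApp \
      myapp.jar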