Saisai,
thanks a ton :)
Regards,
Gourav
On Mon, Sep 11, 2017 at 11:36 PM, Xiaoye Sun wrote:
> Hi Jerry,
>
> This solves my problem. Thanks
>
> On Sun, Sep 10, 2017 at 8:19 PM Saisai Shao
> wrote:
>
>> I guess you're using the Capacity Scheduler with DefaultResourceCalculator,
>> which doesn't count CPU cores in its resource calculation, so the "1" you
>> saw is actually meaningless. If you want CPU to be counted as a resource
>> as well, you should switch to DominantResourceCalculator.
>>
>> Thanks
>> Jerry
>>
>> On Sat, Sep 9, 2017, Xiaoye Sun wrote:
>>
>>> Hi,
>>>
>>> I am using Spark 1.6.1 and YARN 2.7.4.
>>>
>>> I want to submit a Spark application to a YARN cluster. However, I found
>>> that the number of vcores assigned to a container/executor is always 1,
>>> even if I set spark.executor.cores=2. I also found that the number of
>>> tasks an executor runs concurrently is 2.
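
For anyone who lands on this thread later: the switch Jerry describes is
made in capacity-scheduler.xml on the ResourceManager. A minimal sketch,
assuming the stock Hadoop 2.7.x property names:

    <!-- capacity-scheduler.xml: count CPU (vcores) as well as memory
         when the Capacity Scheduler sizes and reports containers -->
    <property>
      <name>yarn.scheduler.capacity.resource-calculator</name>
      <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
    </property>

The default, org.apache.hadoop.yarn.util.resource.DefaultResourceCalculator,
considers memory only, which is why the vcore column always shows 1. A
ResourceManager restart is typically needed for the change to take effect.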
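And a sketch of a submission that asks for 2 cores per executor (the jar
name, memory, and executor counts here are illustrative placeholders, not
values from the thread above):

    $ spark-submit \
        --master yarn \
        --deploy-mode cluster \
        --executor-cores 2 \
        --executor-memory 4g \
        --num-executors 4 \
        app.jar

--executor-cores 2 is the command-line equivalent of spark.executor.cores=2.
Note that Spark schedules tasks against its own core setting regardless of
what YARN reports, which is why each executor was already running 2 tasks
concurrently even while the YARN UI showed 1 vcore.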