Hi,
Say we have 4 nodes with 2 cores each in standalone mode. I'd like to dedicate
4 cores to a streaming application. I can do this via spark submit by:
spark-submit --total-executor-cores 4
However, this assigns one core per machine. I would instead like to use 2 cores
per machine on 2 machines (so 4 cores in total) while not using the other two
machines.
I think you want:
--num-executors 2 --executor-cores 2
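For example, a full invocation might look like this (a sketch only; the master
URL and the application jar name are placeholders):

    spark-submit --master spark://master:7077 \
        --num-executors 2 \
        --executor-cores 2 \
        my-app.jar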
On Mon, Dec 22, 2014 at 10:39 AM, Ashic Mahtab as...@live.com wrote:
Hi Sean,
Thanks for the response.
It seems --num-executors is ignored. Specifying --num-executors 2
--executor-cores 2 is giving the app all 8 cores across 4 machines.
-Ashic.
From: j...@soundcloud.com
Date: Mon, 22 Dec 2014 17:36:26 +0100
Subject: Re: Using more cores on machines
To: as...@live.com
CC: so...@cloudera.com; user@spark.apache.org
AFAIK, `--num-executors` is not available for standalone clusters. In
standalone mode, you must start new workers on your nodes if you want more
than one executor per machine.
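Two standalone-mode settings are relevant here (a sketch; the values are
examples to adjust for your cluster). `spark.deploy.spreadOut`, set on the
standalone master, controls whether an application's cores are spread across
as many workers as possible (the default) or packed onto as few workers as
possible, and `SPARK_WORKER_INSTANCES` starts multiple workers per node:

    # conf/spark-defaults.conf on the standalone master:
    # pack executors onto as few workers as possible
    spark.deploy.spreadOut    false

    # conf/spark-env.sh on each worker node:
    # run two workers per machine, one core each
    SPARK_WORKER_INSTANCES=2
    SPARK_WORKER_CORES=1

With spreadOut disabled, `--total-executor-cores 4` on this 4x2-core cluster
should land on 2 machines rather than one core on each of 4.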