Using more cores on machines

2014-12-22 Thread Ashic Mahtab
Hi, Say we have 4 nodes with 2 cores each in standalone mode. I'd like to dedicate 4 cores to a streaming application. I can do this via spark-submit with: spark-submit --total-executor-cores 4. However, this assigns one core per machine. I would like to use 2 cores on each of 2 machines instead,
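A minimal sketch of the submission being described (the master URL and application jar are placeholders, not from the thread):

```shell
# Standalone cluster: cap the application at 4 cores in total.
# By default the standalone master spreads those cores across as many
# workers as possible, which is why each of the 4 machines ends up
# contributing 1 core instead of 2 machines contributing 2 each.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 4 \
  streaming-app.jar
```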

Re: Using more cores on machines

2014-12-22 Thread Sean Owen
I think you want: --num-executors 2 --executor-cores 2 On Mon, Dec 22, 2014 at 10:39 AM, Ashic Mahtab as...@live.com wrote: Hi, Say we have 4 nodes with 2 cores each in standalone mode. I'd like to dedicate 4 cores to a streaming application. I can do this via spark submit by:
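The suggested flags as a sketch (same placeholder master URL and jar as above). Note that, as the rest of the thread establishes, --num-executors is honored on YARN but ignored by a standalone master:

```shell
# Suggested: 2 executors with 2 cores each (2 machines x 2 cores).
# Caveat: --num-executors is a YARN flag; a standalone master
# ignores it, which is what the next reply observes.
spark-submit \
  --master spark://master-host:7077 \
  --num-executors 2 \
  --executor-cores 2 \
  streaming-app.jar
```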

RE: Using more cores on machines

2014-12-22 Thread Ashic Mahtab
Hi Sean, Thanks for the response. It seems --num-executors is ignored: specifying --num-executors 2 --executor-cores 2 gives the app all 8 cores across 4 machines. -Ashic. From: so...@cloudera.com Date: Mon, 22 Dec 2014 10:57:31 + Subject: Re: Using more cores on machines

RE: Using more cores on machines

2014-12-22 Thread Ashic Mahtab
machine on 2 machines (so 4 cores in total) while not using the other two machines. Regards, Ashic. From: j...@soundcloud.com Date: Mon, 22 Dec 2014 17:36:26 +0100 Subject: Re: Using more cores on machines To: as...@live.com CC: so...@cloudera.com; user@spark.apache.org AFAIK, `--num

Re: Using more cores on machines

2014-12-22 Thread Boromir Widas
...@soundcloud.com Date: Mon, 22 Dec 2014 17:36:26 +0100 Subject: Re: Using more cores on machines To: as...@live.com CC: so...@cloudera.com; user@spark.apache.org AFAIK, `--num-executors` is not available for standalone clusters. In standalone mode, you must start new workers on your node
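For completeness, the usual way to get consolidated allocation in standalone mode (a few workers fully used rather than one core everywhere) is the master-side setting spark.deploy.spreadOut. The sketch below assumes a default standalone layout; spark-env.sh location and the master restart commands follow the standard distribution, and the master URL and jar are placeholders:

```shell
# On the standalone master: disable the default round-robin spreading
# so an app's cores are packed onto as few workers as possible.
# spark.deploy.spreadOut is read by the master process itself,
# not by spark-submit, hence SPARK_MASTER_OPTS in spark-env.sh.
echo 'SPARK_MASTER_OPTS="-Dspark.deploy.spreadOut=false"' \
  >> "$SPARK_HOME/conf/spark-env.sh"

# Restart the master so the setting takes effect.
"$SPARK_HOME/sbin/stop-master.sh"
"$SPARK_HOME/sbin/start-master.sh"

# Now a 4-core cap should land as 2+2 on two workers
# instead of 1+1+1+1 across four.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 4 \
  streaming-app.jar
```

Note this changes scheduling for all applications on that master, not just one; later Spark releases also honor --executor-cores in standalone mode, which gives finer per-app control.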