Hi Sean,
Thanks for the response. 

It seems --num-executors is ignored in standalone mode. Specifying --num-executors 2
--executor-cores 2 still gives the app all 8 cores across all 4 machines.
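One workaround I'm considering (a sketch, assuming our Spark version behaves this way): --num-executors appears to be a YARN-only flag, and in standalone mode the spread of an app across workers is governed by spark.deploy.spreadOut on the master. Setting it to false should pack the requested cores onto as few workers as possible. The master URL below is a placeholder:

```shell
# On the standalone master (e.g. in conf/spark-defaults.conf),
# disable spreading apps across all workers. spark.deploy.spreadOut
# defaults to true; false consolidates executors onto as few nodes
# as possible:
#
#   spark.deploy.spreadOut  false

# Then cap the app's total cores as before. With spreadOut=false the
# scheduler should take 2 cores on each of 2 machines rather than
# 1 core on each of 4.
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 4 \
  ...
```

Haven't verified this end to end yet, so treat it as a guess rather than a confirmed fix.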

-Ashic.

> From: so...@cloudera.com
> Date: Mon, 22 Dec 2014 10:57:31 +0000
> Subject: Re: Using more cores on machines
> To: as...@live.com
> CC: user@spark.apache.org
> 
> I think you want:
> 
> --num-executors 2 --executor-cores 2
> 
> On Mon, Dec 22, 2014 at 10:39 AM, Ashic Mahtab <as...@live.com> wrote:
> > Hi,
> > Say we have 4 nodes with 2 cores each in stand alone mode. I'd like to
> > dedicate 4 cores to a streaming application. I can do this via spark submit
> > by:
> >
> > spark-submit .... --total-executor-cores 4
> >
> > However, this assigns one core per machine. I would like to use 2 cores on 2
> > machines instead, leaving the other two machines untouched. Is this
> > possible? Is there a downside to doing this? My thinking is that I should be
> > able to reduce quite a bit of network traffic if all machines are not
> > involved.
> >
> >
> > Thanks,
> > Ashic.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 