--num-executors sets the total number of executors for the app. YARN
does not have quite the same notion of a Spark worker. In standalone
mode a worker launches one executor for each running app, so in that
sense yes; but if you mean several executors for a single app, that is
possible, though not usual. Running multiple executors for one app on
one worker can be useful if a single executor's heap would otherwise
get huge.
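
For concreteness, here is a minimal sketch of how --num-executors is
passed to spark-submit on YARN. The class name, jar, and resource
sizes are hypothetical placeholders, just to illustrate the flags:

  # Request 8 executors in total for this application on YARN
  spark-submit \
    --master yarn \
    --num-executors 8 \
    --executor-memory 4g \
    --executor-cores 2 \
    --class com.example.MyApp \
    my-app.jar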

On Thu, Feb 26, 2015 at 1:58 AM, Judy Nash
<judyn...@exchange.microsoft.com> wrote:
> Hello,
>
> Does Spark standalone support running multiple executors on one
> worker node?
>
> It seems YARN has the parameter --num-executors to set the number of
> executors to deploy, but I do not find an equivalent parameter in
> Spark standalone.
>
> Thanks,
> Judy
