[ https://issues.apache.org/jira/browse/SPARK-15801?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15318447#comment-15318447 ]

Jonathan Taws commented on SPARK-15801:
---------------------------------------

Indeed, I am getting the same behavior. After quickly sifting through the code, 
it looks like the {{--num-executors}} option isn't taken into account in 
standalone mode, based on the 
{{[allocateWorkerResourceToExecutors|https://github.com/apache/spark/blob/d5911d1173fe0872f21cae6c47abf8ff479345a4/core/src/main/scala/org/apache/spark/deploy/master/Master.scala#L673]}}
 method. 
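For illustration, here is a simplified sketch of the per-worker allocation arithmetic that method performs — this is not the actual Spark code, and the {{executorsPerWorker}} helper name is made up for the example; only the behavior (executors sized from {{--executor-cores}}, one all-cores executor otherwise) is taken from the discussion above:

```scala
// Hypothetical, simplified sketch of the allocation arithmetic in
// Master.allocateWorkerResourceToExecutors (NOT the real Spark code).
// If --executor-cores is set, a worker can host as many executors as its
// free cores allow; otherwise a single executor takes all free cores,
// which is why --num-executors alone has no effect in standalone mode.
def executorsPerWorker(coresFree: Int, coresPerExecutor: Option[Int]): Int =
  coresPerExecutor match {
    case Some(cores) if cores > 0 => coresFree / cores // e.g. 8 free / 2 per executor = 4
    case _                        => 1                 // one executor grabs every free core
  }
```

With a worker offering 8 free cores and {{--executor-cores 2}}, this yields 4 executors on that worker, consistent with the behavior described in the issue.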

> spark-submit --num-executors switch also works without YARN
> -----------------------------------------------------------
>
>                 Key: SPARK-15801
>                 URL: https://issues.apache.org/jira/browse/SPARK-15801
>             Project: Spark
>          Issue Type: Documentation
>          Components: Spark Submit
>    Affects Versions: 1.6.1
>            Reporter: Jonathan Taws
>            Priority: Minor
>
> Based on this [issue|https://issues.apache.org/jira/browse/SPARK-15781] 
> regarding the SPARK_WORKER_INSTANCES property, I also found that the 
> {{--num-executors}} switch documented in the spark-submit help is partially 
> incorrect. 
> Here's one part of the output (produced by {{spark-submit --help}}): 
> {code}
> YARN-only:
>   --driver-cores NUM          Number of cores used by the driver, only in 
> cluster mode
>                               (Default: 1).
>   --queue QUEUE_NAME          The YARN queue to submit to (Default: 
> "default").
>   --num-executors NUM         Number of executors to launch (Default: 2).
> {code}
> Correct me if I am wrong, but the {{--num-executors}} switch also works in 
> Spark standalone mode *without YARN*.
> I tried launching only a master and a worker, with 4 executors specified, and 
> all of them were successfully spawned. The {{--master}} switch pointed to the 
> standalone master's URL, and not to the {{yarn}} value. 
> Here's the exact command: {{spark-submit --master spark://[local 
> machine]:7077 --num-executors 4 --executor-cores 2}}
> By default there is *1* executor per worker in Spark standalone mode without 
> YARN, but this option makes it possible to specify the number of executors 
> (per worker?) if, and only if, the {{--executor-cores}} switch is also set. I 
> do believe it defaults to 2 in YARN mode. 
> I would propose moving the option from the *YARN-only* section to the *Spark 
> standalone and YARN only* section.  



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
