Jonathan Taws created SPARK-15801:
-------------------------------------

             Summary: spark-submit --num-executors switch also works without 
YARN
                 Key: SPARK-15801
                 URL: https://issues.apache.org/jira/browse/SPARK-15801
             Project: Spark
          Issue Type: Documentation
          Components: Spark Submit
    Affects Versions: 1.6.1
            Reporter: Jonathan Taws
            Priority: Minor


Based on this [issue|https://issues.apache.org/jira/browse/SPARK-15781] 
regarding the SPARK_WORKER_INSTANCES property, I also found that the 
{{--num-executors}} switch documented in the spark-submit help is partially 
incorrect. 

Here's one part of the output (produced by {{spark-submit --help}}): 
{code}
YARN-only:
  --driver-cores NUM          Number of cores used by the driver, only in 
cluster mode
                              (Default: 1).
  --queue QUEUE_NAME          The YARN queue to submit to (Default: "default").
  --num-executors NUM         Number of executors to launch (Default: 2).
{code}

Correct me if I am wrong, but the {{--num-executors}} switch also works in Spark 
standalone mode *without YARN*.

I tried this by launching only a master and a single worker with 4 executors 
specified, and all of them were successfully spawned. The {{--master}} switch 
pointed to the standalone master's URL, not to {{yarn}}. 
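
For reference, the reproduction can be sketched roughly as follows (the master 
host, example jar, and class name are placeholders for illustration, not the 
exact commands I ran):
{code}
# Start a standalone master and one worker (no YARN involved)
./sbin/start-master.sh
./sbin/start-slave.sh spark://master-host:7077

# Submit against the standalone master; --num-executors is honored
# only when --executor-cores is also set
./bin/spark-submit \
  --master spark://master-host:7077 \
  --num-executors 4 \
  --executor-cores 1 \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples.jar
{code}
After this, the master UI shows 4 executors registered for the application 
rather than the standalone default of 1 per worker.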

By default, Spark standalone mode without YARN launches *1* executor per worker, 
but this option makes it possible to specify the number of executors (per 
worker?) if, and only if, the {{--executor-cores}} switch is also set. I do 
believe it defaults to 2 in YARN mode. 

I would propose moving the option from the *YARN-only* section of the help 
output to the *Spark standalone and YARN only* section.  



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
