[ https://issues.apache.org/jira/browse/SPARK-11555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14993742#comment-14993742 ]

Sean Owen commented on SPARK-11555:
-----------------------------------

--num-executors? I'm sure we've recently adjusted or fixed this. Are you 
sure you're on the latest version?

This case appears to be handled already in {{ClientArguments.scala}}:

{code}
    if (numExecutors < 0 || (!isDynamicAllocationEnabled && numExecutors == 0)) {
      throw new IllegalArgumentException(
        s"""
           |Number of executors was $numExecutors, but must be at least 1
           |(or 0 if dynamic executor allocation is enabled).
           |${getUsageMessage()}
         """.stripMargin)
    }
{code}
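
For reference, here is a minimal standalone sketch of the same check, so the 
accepted and rejected combinations are easy to see. The {{NumExecutorsCheck}} 
object and {{validateNumExecutors}} helper are illustrative names, not Spark's 
actual API:

{code}
// Standalone sketch mirroring the validation above.
// NumExecutorsCheck and validateNumExecutors are illustrative names,
// not part of Spark's actual API.
object NumExecutorsCheck {
  def validateNumExecutors(numExecutors: Int, isDynamicAllocationEnabled: Boolean): Unit = {
    if (numExecutors < 0 || (!isDynamicAllocationEnabled && numExecutors == 0)) {
      throw new IllegalArgumentException(
        s"Number of executors was $numExecutors, but must be at least 1 " +
          "(or 0 if dynamic executor allocation is enabled).")
    }
  }

  def main(args: Array[String]): Unit = {
    validateNumExecutors(2, isDynamicAllocationEnabled = false) // ok
    validateNumExecutors(0, isDynamicAllocationEnabled = true)  // ok
    validateNumExecutors(0, isDynamicAllocationEnabled = false) // throws IllegalArgumentException
  }
}
{code}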

Do you have the number of executors configured elsewhere, and it's overriding 
this? That is, what is the actual effect: do you end up with 0 executors, an 
error, or something else?
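
One way to check, assuming a live SparkContext such as {{sc}} in spark-shell: 
{{--num-executors}} sets the {{spark.executor.instances}} config key, so 
inspecting the effective conf shows whether another setting is winning. A quick 
sketch:

{code}
// Sketch: inspect the effective settings from a live SparkContext
// (e.g. sc in spark-shell). --num-executors corresponds to the
// spark.executor.instances config key.
println(sc.getConf.getOption("spark.executor.instances").getOrElse("<unset>"))
println(sc.getConf.getOption("spark.dynamicAllocation.enabled").getOrElse("<unset>"))
{code}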

> spark on yarn spark-class --num-workers allows 0 and shouldn't
> --------------------------------------------------------------
>
>                 Key: SPARK-11555
>                 URL: https://issues.apache.org/jira/browse/SPARK-11555
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.5.2
>            Reporter: Thomas Graves
>
> Calling into YARN with a worker count of 0 shouldn't be allowed unless 
> dynamic allocation is enabled. I have a test that does backwards-compatibility 
> testing using the old spark-class entry point and --num-workers, and passing 
> in 0 is now broken. It allows 0 and shouldn't:
> bin/spark-class org.apache.spark.deploy.yarn.Client --jar 
> lib/spark-examples-1.5.2.0-hadoop2.6.0.16.1506060127.jar --class 
> org.apache.spark.examples.SparkPi --num-workers 0 --worker-memory 2g 
> --master-memory 1g --worker-cores 1 --queue default
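
For anyone reproducing this with the current entry point: the deprecated worker 
flags correspond to the executor flags, so a roughly equivalent spark-submit 
invocation (assuming the usual mapping of --num-workers/--worker-memory/--worker-cores 
to --num-executors/--executor-memory/--executor-cores) would be:

{code}
bin/spark-submit --master yarn --queue default \
  --class org.apache.spark.examples.SparkPi \
  --num-executors 0 --executor-memory 2g --executor-cores 1 \
  --driver-memory 1g \
  lib/spark-examples-1.5.2.0-hadoop2.6.0.16.1506060127.jar
{code}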


