Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20925#discussion_r180193382
  
    --- Diff: core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala ---
    @@ -775,17 +781,17 @@ class SparkSubmitSuite
       }
     
       test("SPARK_CONF_DIR overrides spark-defaults.conf") {
    -    forConfDir(Map("spark.executor.memory" -> "2.3g")) { path =>
    +    forConfDir(Map("spark.executor.memory" -> "3g")) { path =>
    --- End diff --
    
    It's just that now, instead of merely printing an error to the output, an exception is actually thrown:
    
    ```
    [info] SparkSubmitSuite:
    [info] - SPARK_CONF_DIR overrides spark-defaults.conf *** FAILED *** (144 milliseconds)
    [info]   org.apache.spark.SparkException: Executor Memory cores must be a positive number
    [info]   at org.apache.spark.deploy.SparkSubmitArguments.error(SparkSubmitArguments.scala:652)
    [info]   at org.apache.spark.deploy.SparkSubmitArguments.validateSubmitArguments(SparkSubmitArguments.scala:267)
    ```
    
    That is because:
    
    ```
    scala> val c = new org.apache.spark.SparkConf()

    scala> c.set("spark.abcde", "2.3g")
    res0: org.apache.spark.SparkConf = org.apache.spark.SparkConf@cd5ff55

    scala> c.getSizeAsBytes("spark.abcde")
    java.lang.NumberFormatException: Size must be specified as bytes (b), kibibytes (k), mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). E.g. 50b, 100k, or 250m. Fractional values are not supported. Input was: 2.3
    ```
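
    For contrast, a minimal sketch (same made-up key as above, public `SparkConf` API only) showing that a whole-number suffix parses fine, which is why the test switches to `3g`:

    ```
    import org.apache.spark.SparkConf

    // Same hypothetical key as the snippet above; "3g" is a whole-number size.
    val c = new SparkConf().set("spark.abcde", "3g")
    c.getSizeAsBytes("spark.abcde")  // 3221225472 (3 GiB in bytes)
    ```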
    
    Just noticed that the error message is kinda wrong ("Executor Memory cores" mashes two settings into one message), but also this whole validation function (`validateSubmitArguments`) leaves a lot to be desired... rough sketch of a tighter per-setting check below.
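
    Purely for illustration, a hedged sketch of what that could look like — `validateMemoryArg` and its wording are made up, not Spark's actual code; `error()` is `SparkSubmitArguments.error` per the stack trace above, and `Utils.byteStringAsBytes` is what `SparkConf.getSizeAsBytes` delegates to:

    ```
    // Hypothetical helper (not Spark's actual code): check one memory setting and
    // name the offending key instead of mixing "Memory" and "cores" in one message.
    def validateMemoryArg(name: String, value: String): Unit = {
      try {
        // byteStringAsBytes rejects fractional sizes like "2.3g" with a
        // NumberFormatException, as the REPL snippet above shows.
        if (org.apache.spark.util.Utils.byteStringAsBytes(value) <= 0) {
          error(s"$name must be a positive size, got: $value")
        }
      } catch {
        case e: NumberFormatException =>
          error(s"Invalid size for $name: ${e.getMessage}")
      }
    }
    ```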

