Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14149#discussion_r70628081
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -2342,6 +2342,18 @@ private[spark] object Utils extends Logging {
        * Return the initial number of executors for dynamic allocation.
        */
       def getDynamicAllocationInitialExecutors(conf: SparkConf): Int = {
    +    if (conf.get(DYN_ALLOCATION_INITIAL_EXECUTORS) < conf.get(DYN_ALLOCATION_MIN_EXECUTORS)) {
    +      logWarning(s"${DYN_ALLOCATION_INITIAL_EXECUTORS.key} less than " +
    +        s"${DYN_ALLOCATION_MIN_EXECUTORS.key} is invalid, will use " +
    +          s"${DYN_ALLOCATION_MIN_EXECUTORS.key} to override.")
    --- End diff --
    
    Right, but as a user I could interpret that to mean it's going to use the min executors for the initial number, which isn't necessarily true. I'm just worried it will confuse users. How about we change the phrasing to something like: "initial less than min is invalid; ignoring its setting, please update your configs." A similar message would work for the num executor instances below.
    
    Then after the max calculation we could simply print an info message like: "using initial executors = value, the max of initial executors, num executors, and min executors."
    
    These warnings are turning into a lot of extra code/text.
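    A minimal standalone sketch of the calculation this review is suggesting: warn that an invalid initial-executors setting is ignored, then take the max of the three values and log it once. The object and method names below are hypothetical illustrations, not the actual Spark code under review, and plain `println` stands in for Spark's `logWarning`/`logInfo`.

    ```scala
    // Hypothetical sketch of the suggested behavior; not the actual
    // Utils.getDynamicAllocationInitialExecutors implementation.
    object InitialExecutorsSketch {
      def initialExecutors(initial: Int, min: Int, numExecutors: Int): Int = {
        if (initial < min) {
          // Per the review: say the setting is invalid and ignored, rather
          // than implying the min value will be used as the initial number.
          println(s"spark.dynamicAllocation.initialExecutors ($initial) is less " +
            s"than spark.dynamicAllocation.minExecutors ($min) and is invalid; " +
            "ignoring its setting, please update your configs.")
        }
        val result = math.max(math.max(initial, min), numExecutors)
        // Single info-style message after the max calculation.
        println(s"Using initial executors = $result, the max of initial " +
          "executors, min executors, and num executors.")
        result
      }
    }
    ```

    For example, with initial = 1, min = 3, and num executors = 2, this warns about the invalid initial setting and returns 3.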
