Github user kiszk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22001#discussion_r208945843
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -1602,6 +1602,15 @@ class SparkContext(config: SparkConf) extends 
Logging {
         }
       }
     
    +  /**
    +   * Get the max number of tasks that can be concurrent launched currently.
    --- End diff --
    
    How about this?
    ```
     * Get the max number of tasks that can be concurrently launched at the time this
     * method is called.
     * Note: do not cache the value returned by this method, because the number can
     * change as executors are added or removed.
    ```
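    The caching caveat can be illustrated with a small sketch. This assumes the method from the diff is exposed as `maxNumConcurrentTasks()` on `SparkContext` (an assumption about the final API name, not something stated in this comment):
    ```scala
    // Sketch only: `sc` is an existing SparkContext, and
    // `maxNumConcurrentTasks` is assumed to be the method added in this diff.

    // Risky: this snapshot goes stale when executors are added or removed.
    val cachedSlots = sc.maxNumConcurrentTasks()

    // Safer: re-read the current value each time a scheduling decision is made.
    def hasEnoughSlots(sc: SparkContext, required: Int): Boolean =
      sc.maxNumConcurrentTasks() >= required
    ```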


---
