GitHub user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21589#discussion_r201887843
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -2336,6 +2336,18 @@ class SparkContext(config: SparkConf) extends Logging {
        */
       def defaultMinPartitions: Int = math.min(defaultParallelism, 2)
     
    +  /**
    +   * Total number of CPU cores of all executors registered in the cluster at the moment.
    +   * The number reflects current status of the cluster and can change in the future.
    +   */
    --- End diff ---
    
    Let's at least leave a `@note` that this feature is experimental.
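
    For example, something like this (a sketch only; the `numCores` name and the
    `???` body are placeholders, since the actual method signature is not visible
    in this hunk):

    ```scala
    /**
     * Total number of CPU cores of all executors registered in the cluster at the moment.
     * The number reflects current status of the cluster and can change in the future.
     *
     * @note This method is experimental and may change or be removed in a
     * future release of Spark.
     */
    def numCores: Int = ???
    ```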


---
