Github user jiangxb1987 commented on the issue:

    https://github.com/apache/spark/pull/21589
  
    > @felixcheung I am not sure that our users are so interested in getting a
list of cores per executor and calculating the total number of cores by summing
the list. From my point of view, it would just complicate the API and implementation.
    
    A list of cores per executor can be useful. One scenario is that users may want
to know how many task slots are available, which requires summing the slots over
all executors, where # of slots on an executor = # of cores on that executor / CPUS_PER_TASK.
A minimal sketch of that calculation is shown below.
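
    Here is a minimal sketch of the slot calculation described above, assuming a
hypothetical `coresPerExecutor: Seq[Int]` as the shape of data the proposed API
might return; the only configuration it reads is the existing `spark.task.cpus`
setting (CPUS_PER_TASK):

    ```scala
    import org.apache.spark.sql.SparkSession

    object SlotCount {
      // total slots = sum over executors of (cores on that executor / cpus per task)
      def totalSlots(coresPerExecutor: Seq[Int], cpusPerTask: Int): Int =
        coresPerExecutor.map(_ / cpusPerTask).sum

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().getOrCreate()
        // CPUS_PER_TASK, defaults to 1
        val cpusPerTask = spark.conf.get("spark.task.cpus", "1").toInt
        // Hypothetical input: 3 executors with 8, 8, and 4 cores respectively
        val coresPerExecutor = Seq(8, 8, 4)
        println(s"available slots = ${totalSlots(coresPerExecutor, cpusPerTask)}")
        spark.stop()
      }
    }
    ```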

