Github user MaxGekk commented on the issue:

    https://github.com/apache/spark/pull/21589

> in this cluster do we really mean cores allocated to the "application" or "job"?

@felixcheung What about `number of CPUs/Executors potentially available to a job submitted via the Spark Context`?