Github user rxin commented on the issue: https://github.com/apache/spark/pull/19881

I thought about this more, and I actually think something like this makes more sense: `executorAllocationRatio`. Basically it is just a ratio that determines how aggressively we want Spark to request executors. A ratio of 1.0 means fill up everything; a ratio of 0.5 means request only half of the executors. What do you think?
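To make the proposal concrete, here is a minimal sketch of how such a ratio could scale the executor request. All names and the formula are hypothetical illustrations of the idea, not Spark's actual implementation or API:

```python
import math

def desired_executors(pending_tasks, tasks_per_executor, allocation_ratio=1.0):
    """Hypothetical sketch of an executorAllocationRatio-style knob.

    A ratio of 1.0 requests enough executors to run every pending task
    concurrently; 0.5 requests roughly half as many, trading some latency
    for lower resource usage.
    """
    # Maximum executors needed to run all pending tasks at once.
    max_needed = math.ceil(pending_tasks / tasks_per_executor)
    # Scale the request down by the ratio, but always ask for at least one.
    return max(1, math.ceil(allocation_ratio * max_needed))
```

With 100 pending tasks and 4 tasks per executor, a ratio of 1.0 would request 25 executors, while 0.5 would request 13.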