Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/23055 @vanzin and @rdblue, I updated the doc because it doesn't sound wrong to me. But, for clarification, we shouldn't document support for something that isn't tested, particularly in a case like the one above where a failure was found. Also, IMHO, when there's a Windows issue it's better to keep things simple from a maintenance standpoint, since there aren't many Windows maintainers in Spark. If I'm not mistaken, the main purpose of that configuration is limiting resources; the allocation behavior in other modes is secondary. Someone should test it and update the doc to show how it works in a follow-up PR.