IIRC that was fixed already in 1.3

https://github.com/apache/spark/commit/b2047b55c5fc85de6b63276d8ab9610d2496e08b
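In the meantime, on affected versions you can work around it by setting the minimum explicitly when submitting. A sketch (the application class and jar name here are placeholders; adjust to your deployment):

```shell
# Raise the minimum to 1 explicitly so the runtime check passes
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --class com.example.MyApp myapp.jar
```

The same property can also go in spark-defaults.conf.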

On Thu, Apr 16, 2015 at 7:41 PM, Michael Stone <mst...@mathom.us> wrote:
> The default for spark.dynamicAllocation.minExecutors is 0, but that value
> causes a runtime error and a message that the minimum is 1. Perhaps the
> default should be changed to 1?
>
> Mike Stone
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>

