IIRC that was already fixed in 1.3:

https://github.com/apache/spark/commit/b2047b55c5fc85de6b63276d8ab9610d2496e08b
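
For anyone stuck on an older release, a minimal workaround sketch (my
assumption, not from the commit above) is to set the minimum explicitly
rather than relying on the default; the app name and other settings here
are placeholders:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: raise minExecutors above the default of 0 to avoid
    // the validation error on pre-1.3 versions.
    val conf = new SparkConf()
      .setAppName("example")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true") // required for dynamic allocation
      .set("spark.dynamicAllocation.minExecutors", "1")
    val sc = new SparkContext(conf)

The same keys can of course go in spark-defaults.conf instead of being
set programmatically.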

On Thu, Apr 16, 2015 at 7:41 PM, Michael Stone <[email protected]> wrote:
> The default for spark.dynamicAllocation.minExecutors is 0, but that value
> causes a runtime error and a message that the minimum is 1. Perhaps the
> default should be changed to 1?
>
> Mike Stone
