Yes, look at what it was before -- it would also reject a minimum of 0.
That's the case you are hitting. 0 is a fine minimum.
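
For example, here is a minimal sketch of a config that the 1.3+ check
accepts (the executor counts are just example values):

  import org.apache.spark.SparkConf

  // Sketch only: minExecutors may be 0 (its default), but maxExecutors must be > 0.
  val conf = new SparkConf()
    .set("spark.dynamicAllocation.enabled", "true")
    .set("spark.shuffle.service.enabled", "true")      // dynamic allocation needs the external shuffle service
    .set("spark.dynamicAllocation.minExecutors", "0")  // 0 is a valid minimum
    .set("spark.dynamicAllocation.maxExecutors", "8")  // must not be 0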

On Thu, Apr 16, 2015 at 8:09 PM, Michael Stone <[email protected]> wrote:
> On Thu, Apr 16, 2015 at 07:47:51PM +0100, Sean Owen wrote:
>>
>> IIRC that was fixed already in 1.3
>>
>>
>> https://github.com/apache/spark/commit/b2047b55c5fc85de6b63276d8ab9610d2496e08b
>
>
> From that commit:
>
> +   private val minNumExecutors = conf.getInt("spark.dynamicAllocation.minExecutors", 0)
> ...
> +   if (maxNumExecutors == 0) {
> +     throw new SparkException("spark.dynamicAllocation.maxExecutors cannot be 0!")
