On Thu, Apr 16, 2015 at 12:16:13PM -0700, Marcelo Vanzin wrote:
I think Michael is referring to this:

Exception in thread "main" java.lang.IllegalArgumentException: You
must specify at least 1 executor!
Usage: org.apache.spark.deploy.yarn.Client [options]

spark-submit --conf spark.dynamicAllocation.enabled=true --conf
spark.dynamicAllocation.minExecutors=0

Yes, sorry, there were too many mins.
On Thu, Apr 16, 2015 at 07:47:51PM +0100, Sean Owen wrote:
IIRC that was fixed already in 1.3
https://github.com/apache/spark/commit/b2047b55c5fc85de6b63276d8ab9610d2496e08b
From that commit:

+  private val minNumExecutors = conf.getInt("spark.dynamicAllocation.minExecutors", 0)
On Thu, Apr 16, 2015 at 08:10:54PM +0100, Sean Owen wrote:
Yes, look what it was before -- would also reject a minimum of 0.
That's the case you are hitting. 0 is a fine minimum.
How can 0 be a fine minimum if it's rejected? Changing the value is easy
enough, but in general it's nice for the defaults to work.
https://github.com/apache/spark/commit/b2047b55c5fc85de6b63276d8ab9610d2496e08b

From that commit:

+  private val minNumExecutors = conf.getInt("spark.dynamicAllocation.minExecutors", 0)
...
+  if (maxNumExecutors == 0) {
+    throw new SparkException("spark.dynamicAllocation.maxExecutors cannot be 0")
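Putting the quoted checks side by side, here is a minimal standalone sketch of the behavior being discussed — the object and method names are hypothetical (this is not Spark's actual class), and only the checks quoted in this thread are modeled:

```scala
// Illustrative sketch: why minExecutors = 0 is accepted while a zero
// max or a zero requested executor count is rejected. Hypothetical names;
// only the logic mirrors the quoted commit and the yarn.Client error.
object ValidationSketch {
  def validate(min: Int, max: Int, initial: Int): Unit = {
    // Per the quoted commit, minExecutors defaults to 0 and 0 is allowed.
    require(min >= 0, "minExecutors must be >= 0")
    // ...but a zero maximum is rejected outright:
    if (max == 0) {
      throw new IllegalArgumentException(
        "spark.dynamicAllocation.maxExecutors cannot be 0")
    }
    // ...and a zero executor count trips the yarn.Client-style check,
    // which is what surfaces as the reported error:
    if (initial < 1) {
      throw new IllegalArgumentException("You must specify at least 1 executor!")
    }
  }

  def main(args: Array[String]): Unit = {
    validate(min = 0, max = 10, initial = 1) // ok: 0 is a fine minimum
    try {
      validate(min = 0, max = 10, initial = 0)
    } catch {
      case e: IllegalArgumentException => println(e.getMessage)
    }
  }
}
```

Running `main`, the first call succeeds and the second prints the error message from the thread, matching the symptom: the minimum of 0 itself is fine, but a zero executor count is not.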
Looks like that message would be triggered if
spark.dynamicAllocation.initialExecutors was not set, or 0, if I read
this right. Yeah, that might have to be positive. This requires you to
set initial executors to 1 if you want 0 min executors. Hm, maybe that
shouldn't be an error condition in the args.
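If that reading is right, explicitly setting the initial count should sidestep the error while keeping a floor of 0. An untested sketch using only the config keys already mentioned in the thread (`your-app.jar` is a placeholder for the actual application):

```shell
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=0 \
  --conf spark.dynamicAllocation.initialExecutors=1 \
  your-app.jar
```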
The default for spark.dynamicAllocation.minExecutors is 0, but that
value causes a runtime error and a message that the minimum is 1.
Perhaps the default should be changed to 1?
Mike Stone
On Thu, Apr 16, 2015 at 7:41 PM, Michael Stone mst...@mathom.us wrote:
The default for spark.dynamicAllocation.minExecutors is 0, but that value
causes a runtime error and a message that the minimum is 1.