[ https://issues.apache.org/jira/browse/SPARK-16382?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15365255#comment-15365255 ]

Ryan Blue commented on SPARK-16382:
-----------------------------------

[~jerryshao], [~tgraves], I think you're both right that this is currently 
caught. The behavior I observed was in our local copy, which carried an older 
patch for SPARK-13723 that used {{spark.executor.instances}} to raise the 
minimum number of executors rather than the initial number. For jobs where the 
min ended up higher than the max, Spark would try to acquire the min number of 
executors, never release any, and never complain that the min was higher than 
the max.
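
For concreteness, the misconfigured state looked roughly like this (the 
property values are hypothetical):

{code:scala}
import org.apache.spark.SparkConf

// Hypothetical values: the older local patch effectively turned
// --num-executors into a floor, so the min ended up above the max.
val conf = new SparkConf()
  .set("spark.dynamicAllocation.enabled", "true")
  .set("spark.dynamicAllocation.minExecutors", "200") // raised via --num-executors
  .set("spark.dynamicAllocation.maxExecutors", "100") // left unchanged
// With that patched build, Spark kept requesting 200 executors, never
// released any, and never flagged that min > max.
{code}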

I originally suggested that the max should be increased, which doesn't 
currently happen, but then decided it would be better to fail, so I added that 
to the description. That's why I missed that Spark already fails. I'll close 
this. Thanks!
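
For reference, a minimal sketch of the kind of startup sanity check that 
catches this (illustrative names, not the literal upstream code):

{code:scala}
import org.apache.spark.SparkException

// Sketch of the validation dynamic allocation performs when it starts up
// (illustrative; the real check lives in Spark's allocation manager).
def validateSettings(minExecutors: Int, maxExecutors: Int): Unit = {
  if (maxExecutors == 0) {
    throw new SparkException("spark.dynamicAllocation.maxExecutors cannot be 0!")
  }
  if (minExecutors > maxExecutors) {
    throw new SparkException(
      s"spark.dynamicAllocation.minExecutors ($minExecutors) must be less " +
      s"than or equal to spark.dynamicAllocation.maxExecutors ($maxExecutors)!")
  }
}
{code}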

> YARN - Dynamic allocation with spark.executor.instances should increase max 
> executors.
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-16382
>                 URL: https://issues.apache.org/jira/browse/SPARK-16382
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>            Reporter: Ryan Blue
>
> SPARK-13723 changed the behavior of dynamic allocation when 
> {{--num-executors}} ({{spark.executor.instances}}) is set. Rather than 
> turning off dynamic allocation, the value is used as the initial number of 
> executors. This did not change the behavior of 
> {{spark.dynamicAllocation.maxExecutors}}. We've noticed that some users set 
> {{--num-executors}} higher than the max with the expectation that the max 
> increases.
>
> I think that either the max should be increased, or Spark should fail and 
> complain that the number of executors requested is higher than the max.
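
To make the quoted behavior concrete: since SPARK-13723, 
{{spark.executor.instances}} feeds the initial executor target instead of 
disabling dynamic allocation. A hedged sketch of that derivation (the helper 
name and exact precedence here are illustrative):

{code:scala}
// Illustrative sketch: deriving the initial executor target once
// spark.executor.instances no longer disables dynamic allocation.
// Note that maxExecutors plays no part here, which is the gap this
// issue describes: --num-executors above the max does not raise it.
def initialExecutorTarget(
    initialExecutors: Int, // spark.dynamicAllocation.initialExecutors
    minExecutors: Int,     // spark.dynamicAllocation.minExecutors
    numExecutors: Int      // spark.executor.instances / --num-executors
): Int = {
  Seq(initialExecutors, minExecutors, numExecutors).max
}
{code}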


