[ 
https://issues.apache.org/jira/browse/SPARK-10582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

KaiXinXIaoLei updated SPARK-10582:
----------------------------------
    Description: While tasks are running and the total number of executors has 
reached spark.dynamicAllocation.maxExecutors, the AM fails and a new AM is 
restarted. Because the executor target tracked in ExecutorAllocationManager 
has not changed, the driver does not send a RequestExecutors message to the 
new AM to ask for executors. The new AM therefore falls back to 
spark.dynamicAllocation.initialExecutors, so the driver and the AM disagree 
on the total number of executors.  (was: Using dynamic executor allocation, 
if the AM fails while tasks are running, a new AM is started, but the new AM 
does not allocate executors for the driver.)

> Using dynamic executor allocation, if the AM fails, a new AM is started, 
> but the new AM does not allocate executors to the driver
> ---------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10582
>                 URL: https://issues.apache.org/jira/browse/SPARK-10582
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.1
>            Reporter: KaiXinXIaoLei
>
> While tasks are running and the total number of executors has reached 
> spark.dynamicAllocation.maxExecutors, the AM fails and a new AM is 
> restarted. Because the executor target tracked in ExecutorAllocationManager 
> has not changed, the driver does not send a RequestExecutors message to the 
> new AM to ask for executors. The new AM therefore falls back to 
> spark.dynamicAllocation.initialExecutors, so the driver and the AM disagree 
> on the total number of executors.
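The state mismatch described above can be sketched in a few lines of Scala. This is an illustrative model only: names such as driverTarget and amTargetAfterRestart are hypothetical stand-ins, not actual Spark internals.

```scala
// Minimal sketch of the driver/AM desynchronization after an AM restart,
// under the assumption that the driver only sends RequestExecutors when
// its own target value changes.
object AmRestartDesync {
  val maxExecutors = 100    // spark.dynamicAllocation.maxExecutors
  val initialExecutors = 2  // spark.dynamicAllocation.initialExecutors

  // Driver side: ExecutorAllocationManager's target has ramped up to the
  // maximum and does not change when the AM fails, so no RequestExecutors
  // message is triggered.
  val driverTarget: Int = maxExecutors

  // AM side: the restarted AM starts over from the initial value because
  // it never receives an updated target from the driver.
  val amTargetAfterRestart: Int = initialExecutors

  def desynchronized: Boolean = driverTarget != amTargetAfterRestart

  def main(args: Array[String]): Unit =
    println(s"driver=$driverTarget am=$amTargetAfterRestart " +
            s"desynchronized=$desynchronized")
}
```

Under this model, the fix would be for the driver to resend its current target to the AM on re-registration, rather than only on target changes.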



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
