GitHub user jerryshao opened a pull request: https://github.com/apache/spark/pull/9684
[SPARK-11718][Yarn][Core] Fix issue where explicitly killed executors die silently

Currently, if dynamic allocation is enabled, explicitly killing an executor gets no response, so the executor metadata on the driver side becomes stale, which makes dynamic allocation on Yarn fail to work.

The problem is that `disableExecutor` returns false for executors with a pending kill when `onDisconnect` is detected, so no further handling is done. One solution is to bypass these explicitly killed executors and use `super.onDisconnect` to remove the executor; this is simple. Another solution is to still query the loss reason for these explicitly killed executors. Since an executor may be killed and reported in the same AM-RM communication round, the current way of adding a pending loss-reason request does not work (the container-complete event has already been processed), so we should store this loss reason for a later query.

This PR chooses solution 2. Please help to review.

@vanzin I think this part was changed by you previously; would you please help to review? Thanks a lot.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jerryshao/apache-spark SPARK-11718

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/9684.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #9684

----
commit 8719ee480fe00ecaa85f3f08c8a8bc578a226bcc
Author: jerryshao <ss...@hortonworks.com>
Date: 2015-11-13T05:47:46Z

    Fix explicitly killing executor dies silently issue
----
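The "store the loss reason for a later query" idea (solution 2) can be sketched as follows. This is a minimal, hypothetical model, not Spark's actual `YarnAllocator` code: the names `ExecutorTracker`, `LossReason`, `pendingKills`, and `storedLossReasons` are illustrative only. It shows why keeping the reason around matters when the kill and the container-complete report arrive in the same AM-RM round, before the driver asks why the executor disconnected.

```scala
import scala.collection.mutable

// Hypothetical stand-in for Spark's executor loss reason type.
case class LossReason(message: String)

// Minimal sketch of the "store loss reason for later query" approach.
class ExecutorTracker {
  // Executors we have explicitly asked the cluster manager to kill.
  private val pendingKills = mutable.Set[String]()
  // Loss reasons for executors whose container-complete event arrived
  // before the driver's query; kept for a later lookup instead of
  // being dropped.
  private val storedLossReasons = mutable.Map[String, LossReason]()

  def requestKill(executorId: String): Unit = pendingKills += executorId

  // Called when the AM-RM heartbeat reports a completed container.
  def onContainerCompleted(executorId: String, reason: LossReason): Unit = {
    if (pendingKills.remove(executorId)) {
      // Kill request and completion were processed in the same round:
      // remember the reason so a later query can still be answered.
      storedLossReasons(executorId) = reason
    }
  }

  // Called when the driver notices the disconnect and asks for the reason.
  def queryLossReason(executorId: String): Option[LossReason] =
    storedLossReasons.remove(executorId)
}
```

With this shape, the later `queryLossReason` call succeeds even though the container-complete event was already consumed, which is the gap the PR description points at.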