Repository: spark
Updated Branches:
  refs/heads/branch-1.4 568d1d51d -> 2846a357f


[SPARK-8273] Driver hangs up when yarn shutdown in client mode

In client mode, if YARN is shut down while a Spark application is running, the 
application hangs after several retries (default: 30) because the exception 
thrown by YarnClientImpl is not caught at the upper level. We should exit in 
that case so the user is aware of the failure.

The exception we want to catch is 
[here](https://github.com/apache/hadoop/blob/branch-2.7.0/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/retry/RetryInvocationHandler.java#L122),
 and the fix follows the approach used in 
[MR](https://github.com/apache/hadoop/blob/branch-2.7.0/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/src/main/java/org/apache/hadoop/mapred/ClientServiceDelegate.java#L320).
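
A minimal sketch of the pattern, assuming a hypothetical `YarnStatePoller` helper built on the standard Hadoop `YarnClient` API (the actual change is just the small `NonFatal` case in the diff below): poll the application report once and map any non-fatal failure to a terminal state, so the monitoring loop returns instead of retrying until it hangs.

```scala
import scala.util.control.NonFatal

import org.apache.hadoop.yarn.api.records.{ApplicationId, FinalApplicationStatus, YarnApplicationState}
import org.apache.hadoop.yarn.client.api.YarnClient
import org.apache.hadoop.yarn.exceptions.ApplicationNotFoundException

// Hypothetical helper (not the actual Client.scala code) illustrating the fix.
object YarnStatePoller {
  def pollApplicationState(
      yarnClient: YarnClient,
      appId: ApplicationId): (YarnApplicationState, FinalApplicationStatus) = {
    try {
      val report = yarnClient.getApplicationReport(appId)
      (report.getYarnApplicationState, report.getFinalApplicationStatus)
    } catch {
      case e: ApplicationNotFoundException =>
        // The application no longer exists on the ResourceManager.
        (YarnApplicationState.KILLED, FinalApplicationStatus.KILLED)
      case NonFatal(e) =>
        // Any other non-fatal error (e.g. the RM is unreachable after shutdown)
        // ends monitoring with FAILED rather than hanging on repeated retries.
        (YarnApplicationState.FAILED, FinalApplicationStatus.FAILED)
    }
  }
}
```

The key design point is matching on `NonFatal(e)` rather than `Throwable`, so fatal JVM errors still propagate while RPC-level failures terminate monitoring cleanly.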

Author: WangTaoTheTonic <wangtao...@huawei.com>

Closes #6717 from WangTaoTheTonic/SPARK-8273 and squashes the following commits:

28752d6 [WangTaoTheTonic] catch the throwed exception


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/2846a357
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/2846a357
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/2846a357

Branch: refs/heads/branch-1.4
Commit: 2846a357f32bfa129bc37f4d1cbe9e19caaf69c9
Parents: 568d1d5
Author: WangTaoTheTonic <wangtao...@huawei.com>
Authored: Wed Jun 10 13:34:19 2015 -0700
Committer: Andrew Or <and...@databricks.com>
Committed: Wed Jun 10 13:36:16 2015 -0700

----------------------------------------------------------------------
 yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala | 4 ++++
 1 file changed, 4 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/2846a357/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
----------------------------------------------------------------------
diff --git a/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala b/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
index f4d4321..9296e79 100644
--- a/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
+++ b/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
@@ -28,6 +28,7 @@ import scala.collection.JavaConversions._
 import scala.collection.mutable.{ArrayBuffer, HashMap, HashSet, ListBuffer, Map}
 import scala.reflect.runtime.universe
 import scala.util.{Try, Success, Failure}
+import scala.util.control.NonFatal
 
 import com.google.common.base.Objects
 import com.google.common.io.Files
@@ -771,6 +772,9 @@ private[spark] class Client(
           case e: ApplicationNotFoundException =>
             logError(s"Application $appId not found.")
             return (YarnApplicationState.KILLED, FinalApplicationStatus.KILLED)
+          case NonFatal(e) =>
+            logError(s"Failed to contact YARN for application $appId.", e)
+            return (YarnApplicationState.FAILED, FinalApplicationStatus.FAILED)
         }
       val state = report.getYarnApplicationState
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
