[ https://issues.apache.org/jira/browse/SPARK-10871?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Or resolved SPARK-10871.
-------------------------------
       Resolution: Fixed
         Assignee: Ryan Williams
    Fix Version/s: 1.6.0
                   1.5.2
 Target Version/s: 1.5.2, 1.6.0

> Specify number of failed executors in ApplicationMaster error message
> ---------------------------------------------------------------------
>
>                 Key: SPARK-10871
>                 URL: https://issues.apache.org/jira/browse/SPARK-10871
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.5.1
>            Reporter: Ryan Williams
>            Assignee: Ryan Williams
>            Priority: Minor
>             Fix For: 1.5.2, 1.6.0
>
>
> I ran into [this|https://github.com/apache/spark/blob/9b9fe5f7bf55257269d8febcd64e95677075dfb6/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L346-L348] error message today while debugging a failed app:
> {code}
> 15/09/29 00:33:20 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 11, (reason: Max number of executor failures reached)
> 15/09/29 00:33:23 INFO util.ShutdownHookManager: Shutdown hook called
> {code}
> This app ran with dynamic allocation, and I'm not sure what limit was used as the "maximum allowable number of failed executors"; in any case, the error message may as well specify it.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
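For context on the limit the reporter asks about: the documented default for `spark.yarn.max.executor.failures` is twice the number of executors, with a minimum of 3, and under dynamic allocation the executor count comes from the allocation bounds rather than `spark.executor.instances`. The sketch below paraphrases that derivation in Python; it is an illustration of the documented behavior, not a verbatim port of `ApplicationMaster.scala`, so the exact logic should be checked against the linked source.

```python
def max_num_executor_failures(conf):
    """Sketch of how the YARN ApplicationMaster derives its failure limit.

    `conf` is a plain dict standing in for SparkConf (an assumption for
    illustration). Config key names are real Spark settings; the
    derivation paraphrases the documented default of
    max(2 * numExecutors, 3).
    """
    if conf.get("spark.dynamicAllocation.enabled", "false") == "true":
        # Under dynamic allocation, size the limit from the upper bound.
        num_executors = int(conf.get("spark.dynamicAllocation.maxExecutors", 2**31 - 1))
    else:
        num_executors = int(conf.get("spark.executor.instances", 0))
    default_limit = max(2 * num_executors, 3)
    # An explicit spark.yarn.max.executor.failures overrides the default.
    return int(conf.get("spark.yarn.max.executor.failures", default_limit))

# With a static allocation of 10 executors, the default limit is 20:
print(max_num_executor_failures({"spark.executor.instances": "10"}))
```

Including the computed limit in the FAILED message (as this fix does) makes it unnecessary for users to reconstruct this arithmetic by hand.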