[ https://issues.apache.org/jira/browse/SPARK-18027?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15601381#comment-15601381 ]
Sean Owen edited comment on SPARK-18027 at 10/24/16 8:53 AM:
-------------------------------------------------------------

EDIT: OK, I buy into this. https://github.com/apache/spark/pull/15598#discussion_r84643826

was (Author: srowen): Hm, but in this case, whatever it may do, Spark is done with the app, so is there a reason to keep its staging directory? Or: if you believe Spark should keep trying to find the app status, then more logic needs to change, because right now Spark just stops and treats it as failed.

> .sparkStaging not clean on RM ApplicationNotFoundException
> ----------------------------------------------------------
>
>                 Key: SPARK-18027
>                 URL: https://issues.apache.org/jira/browse/SPARK-18027
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
> Affects Versions: 1.6.0
>         Reporter: David Shar
>         Priority: Minor
>
> Hi,
> It seems that SPARK-7705 didn't fix all issues with .sparkStaging folder cleanup.
> In Client.scala, monitorApplication:
> {code}
> val report: ApplicationReport =
>   try {
>     getApplicationReport(appId)
>   } catch {
>     case e: ApplicationNotFoundException =>
>       logError(s"Application $appId not found.")
>       return (YarnApplicationState.KILLED, FinalApplicationStatus.KILLED)
>     case NonFatal(e) =>
>       logError(s"Failed to contact YARN for application $appId.", e)
>       return (YarnApplicationState.FAILED, FinalApplicationStatus.FAILED)
>   }
> ...
> if (state == YarnApplicationState.FINISHED ||
>     state == YarnApplicationState.FAILED ||
>     state == YarnApplicationState.KILLED) {
>   cleanupStagingDir(appId)
>   return (state, report.getFinalApplicationStatus)
> }
> {code}
> In the case of ApplicationNotFoundException, we don't clean up the .sparkStaging folder.
> I believe we should call cleanupStagingDir(appId) in the catch clause above.
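The proposed change (cleaning up the staging directory when the RM no longer knows the application) can be sketched as a small self-contained Scala program. All the YARN names here (ApplicationNotFoundException, getApplicationReport, cleanupStagingDir, the state enumerations) are simplified stand-ins for illustration, not the real Hadoop/Spark classes:

```scala
import scala.util.control.NonFatal

// Hedged sketch only: stubs mimic the shape of Client.monitorApplication,
// showing cleanupStagingDir being called in the not-found catch clause.
object StagingCleanupSketch {
  class ApplicationNotFoundException(msg: String) extends Exception(msg)

  object YarnApplicationState extends Enumeration { val FINISHED, FAILED, KILLED = Value }
  object FinalApplicationStatus extends Enumeration { val SUCCEEDED, FAILED, KILLED = Value }

  // Records which app IDs had their staging dirs cleaned, so behavior is observable.
  var cleanedUp: List[String] = Nil
  def cleanupStagingDir(appId: String): Unit = cleanedUp ::= appId

  // Stub RM call: pretend the RM has already forgotten this application.
  def getApplicationReport(appId: String): String =
    throw new ApplicationNotFoundException(s"Application $appId not found")

  def monitorApplication(appId: String): (YarnApplicationState.Value, FinalApplicationStatus.Value) =
    try {
      getApplicationReport(appId)
      (YarnApplicationState.FINISHED, FinalApplicationStatus.SUCCEEDED)
    } catch {
      case _: ApplicationNotFoundException =>
        // Proposed fix: Spark is done with this app, so drop its staging dir
        // before returning, instead of leaking it as the current code does.
        cleanupStagingDir(appId)
        (YarnApplicationState.KILLED, FinalApplicationStatus.KILLED)
      case NonFatal(_) =>
        (YarnApplicationState.FAILED, FinalApplicationStatus.FAILED)
    }
}
```

Running `StagingCleanupSketch.monitorApplication("app_1")` returns the KILLED pair and leaves `"app_1"` in `cleanedUp`, which is the behavior the report asks for.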