[ https://issues.apache.org/jira/browse/SPARK-38910?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17522638#comment-17522638 ]
Apache Spark commented on SPARK-38910:
--------------------------------------

User 'AngersZhuuuu' has created a pull request for this issue:
https://github.com/apache/spark/pull/36207

> Clean sparkStaging dir when WAIT_FOR_APP_COMPLETION is false too
> ----------------------------------------------------------------
>
>                 Key: SPARK-38910
>                 URL: https://issues.apache.org/jira/browse/SPARK-38910
>             Project: Spark
>          Issue Type: Task
>          Components: YARN
>    Affects Versions: 3.2.1, 3.3.0
>            Reporter: angerszhu
>            Priority: Major
>
> {code:java}
> def run(): Unit = {
>   submitApplication()
>   if (!launcherBackend.isConnected() && fireAndForget) {
>     val report = getApplicationReport(appId)
>     val state = report.getYarnApplicationState
>     logInfo(s"Application report for $appId (state: $state)")
>     logInfo(formatReportDetails(report, getDriverLogsLink(report)))
>     if (state == YarnApplicationState.FAILED || state == YarnApplicationState.KILLED) {
>       throw new SparkException(s"Application $appId finished with status: $state")
>     }
>   } else {
>     val YarnAppReport(appState, finalState, diags) = monitorApplication(appId)
>     if (appState == YarnApplicationState.FAILED || finalState == FinalApplicationStatus.FAILED) {
>       var amContainerSucceed = false
>       val amContainerExitMsg = s"AM Container for " +
>         s"${yarnClient.getApplicationReport(appId).getCurrentApplicationAttemptId} " +
>         s"exited with exitCode: 0"
>       diags.foreach { err =>
>         logError(s"Application diagnostics message: $err")
>         if (err.contains(amContainerExitMsg)) {
>           amContainerSucceed = true
> {code}
> The staging dir is not cleaned when the following condition matches:
> {code:java}
> !launcherBackend.isConnected() && fireAndForget
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
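The branch structure described in the issue can be sketched as a small decision function. This is a hypothetical model, not the actual `org.apache.spark.deploy.yarn.Client` code: the names `stagingDirCleaned` and `withFix` are illustrative, with `withFix` standing in for the change this PR proposes (cleaning the staging dir on the fire-and-forget path too).

```scala
// Hypothetical model of the two branches in Client.run() quoted above.
// All names here are illustrative, not real Spark APIs.
object StagingDirSketch {
  // Returns true if the sparkStaging directory would be cleaned up.
  def stagingDirCleaned(launcherConnected: Boolean,
                        fireAndForget: Boolean,
                        withFix: Boolean): Boolean = {
    if (!launcherConnected && fireAndForget) {
      // Fire-and-forget branch (WAIT_FOR_APP_COMPLETION = false): the client
      // logs one application report and returns. Before the fix, nothing on
      // this path cleans the staging dir.
      withFix
    } else {
      // monitorApplication() branch: cleanup already happens once the
      // application completes.
      true
    }
  }
}
```

With `withFix = false`, the fire-and-forget path leaves `.sparkStaging` behind, which is the leak this issue reports; the monitoring path cleans up in both cases.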