[ 
https://issues.apache.org/jira/browse/SPARK-6033?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14339683#comment-14339683
 ] 

Apache Spark commented on SPARK-6033:
-------------------------------------

User 'hseagle' has created a pull request for this issue:
https://github.com/apache/spark/pull/4803

> the description about "spark.worker.cleanup.enabled" does not match the 
> code
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-6033
>                 URL: https://issues.apache.org/jira/browse/SPARK-6033
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 1.2.0, 1.2.1
>            Reporter: pengxu
>            Priority: Minor
>
> There are some errors in the _Cluster Launch Scripts_ section of 
> http://spark.apache.org/docs/latest/spark-standalone.html
> The description of the property spark.worker.cleanup.enabled states 
> that *all the directories* under the work dir will be removed, whether the 
> application is running or not.
> After checking the implementation at the code level, I found that +only 
> stopped applications'+ dirs are removed, so the description in the 
> document is incorrect.
> The code implementation in Worker.scala:
> {code:title=WorkDirCleanup}
> case WorkDirCleanup =>
>       // Spin up a separate thread (in a future) to do the dir cleanup;
>       // don't tie up worker actor
>       val cleanupFuture = concurrent.future {
>         val appDirs = workDir.listFiles()
>         if (appDirs == null) {
>           throw new IOException("ERROR: Failed to list files in " + appDirs)
>         }
>         appDirs.filter { dir =>
>           // the directory is used by an application - check that the
>           // application is not running when cleaning up
>           val appIdFromDir = dir.getName
>           val isAppStillRunning =
>             executors.values.map(_.appId).contains(appIdFromDir)
>           dir.isDirectory && !isAppStillRunning &&
>           !Utils.doesDirectoryContainAnyNewFiles(dir, APP_DATA_RETENTION_SECS)
>         }.foreach { dir =>
>           logInfo(s"Removing directory: ${dir.getPath}")
>           Utils.deleteRecursively(dir)
>         }
>       }
>       cleanupFuture onFailure {
>         case e: Throwable =>
>           logError("App dir cleanup failed: " + e.getMessage, e)
>       }
> {code}
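
For reference, the cleanup predicate in the snippet above can be sketched in isolation. This is a minimal illustration, not the Worker's actual code: `runningAppIds` is a hypothetical stand-in for `executors.values.map(_.appId)`, and `containsRecentFiles` stands in for `Utils.doesDirectoryContainAnyNewFiles`:

```scala
import java.io.File

// Sketch of the condition under which an app dir is removed: the dir must
// exist, no running executor may still reference its appId, and it must
// contain no files newer than the retention window.
def isEligibleForCleanup(
    dir: File,
    runningAppIds: Set[String],                 // hypothetical: running app ids
    containsRecentFiles: File => Boolean        // hypothetical: recent-file check
): Boolean = {
  val appIdFromDir = dir.getName
  dir.isDirectory &&
    !runningAppIds.contains(appIdFromDir) &&
    !containsRecentFiles(dir)
}
```

Under this reading, a directory belonging to a still-running application is never eligible, which is why the documented "all the directories under the work dir" wording overstates what the code does.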



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
