Repository: spark
Updated Branches:
  refs/heads/branch-1.6 204f3601d -> d5145210b


[SPARK-11974][CORE] Not all the temp dirs had been deleted when the JVM exits

The shutdown hook deletes the temp dirs by removing entries from the set while iterating over it, which behaves like this:

```

scala> import scala.collection.mutable
import scala.collection.mutable

scala> val a = mutable.Set(1,2,3,4,7,0,8,98,9)
a: scala.collection.mutable.Set[Int] = Set(0, 9, 1, 2, 3, 7, 4, 8, 98)

scala> a.foreach(x => {a.remove(x) })

scala> a.foreach(println(_))
98
```

You may not modify a collection while traversing or iterating over it, so this cannot delete all elements of the collection.
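
For comparison, a minimal sketch of the same example iterating over a materialized copy (the approach the patch takes with `toArray`); the removals then no longer interfere with the traversal:

```
import scala.collection.mutable

val a = mutable.Set(1, 2, 3, 4, 7, 0, 8, 98, 9)

// Iterate over a materialized copy; removals no longer disturb the traversal.
a.toArray.foreach(x => a.remove(x))

println(a) // prints an empty set -- every element was removed
```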

Author: Zhongshuai Pei <peizhongsh...@huawei.com>

Closes #9951 from DoingDone9/Bug_RemainDir.

(cherry picked from commit 6b781576a15d8d5c5fbed8bef1c5bda95b3d44ac)
Signed-off-by: Reynold Xin <r...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/d5145210
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/d5145210
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/d5145210

Branch: refs/heads/branch-1.6
Commit: d5145210bd59072a33b61f15348d5e794f6df4e0
Parents: 204f360
Author: Zhongshuai Pei <peizhongsh...@huawei.com>
Authored: Wed Nov 25 10:37:34 2015 -0800
Committer: Reynold Xin <r...@databricks.com>
Committed: Wed Nov 25 10:37:42 2015 -0800

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/util/ShutdownHookManager.scala   | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/d5145210/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala b/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
index db4a8b3..4012dca 100644
--- a/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
+++ b/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
@@ -57,7 +57,9 @@ private[spark] object ShutdownHookManager extends Logging {
   // Add a shutdown hook to delete the temp dirs when the JVM exits
   addShutdownHook(TEMP_DIR_SHUTDOWN_PRIORITY) { () =>
     logInfo("Shutdown hook called")
-    shutdownDeletePaths.foreach { dirPath =>
+    // we need to materialize the paths to delete because deleteRecursively removes items from
+    // shutdownDeletePaths as we are traversing through it.
+    shutdownDeletePaths.toArray.foreach { dirPath =>
       try {
         logInfo("Deleting directory " + dirPath)
         Utils.deleteRecursively(new File(dirPath))

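For context, here is a self-contained sketch of the pattern the fix relies on, written against plain JDK and Scala collections rather than Spark's internals; the object TempDirCleanup and the helpers registerTempDir and deleteRecursively are illustrative names, not Spark's API:

```
import java.io.File
import scala.collection.mutable

// Illustrative stand-in for Spark's ShutdownHookManager; names here are
// hypothetical and only mirror the pattern described in the commit message.
object TempDirCleanup {

  // Paths registered for deletion when the JVM exits (cf. shutdownDeletePaths).
  private val tempDirs = mutable.HashSet[String]()

  def registerTempDir(path: String): Unit = tempDirs.synchronized {
    tempDirs += path
  }

  // Deleting a path also removes it from the registry, so the set is mutated
  // while the shutdown hook iterates over it -- the root cause of the bug.
  private def deleteRecursively(file: File): Unit = {
    if (file.isDirectory) {
      Option(file.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    }
    file.delete()
    tempDirs.synchronized { tempDirs -= file.getAbsolutePath }
  }

  Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
    override def run(): Unit = {
      // Materialize the paths first (toArray) so the removals performed by
      // deleteRecursively cannot cause the traversal to skip entries.
      val paths = tempDirs.synchronized { tempDirs.toArray }
      paths.foreach { dirPath =>
        try {
          deleteRecursively(new File(dirPath))
        } catch {
          case e: Exception => System.err.println(s"Error deleting $dirPath: $e")
        }
      }
    }
  }))
}
```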
