mridulm commented on code in PR #39226:
URL: https://github.com/apache/spark/pull/39226#discussion_r1058705480


##########
core/src/main/scala/org/apache/spark/status/AppStatusStore.scala:
##########
@@ -733,6 +734,15 @@ private[spark] class AppStatusStore(
 
   def close(): Unit = {
     store.close()
+    cleanUpStorePath()
+  }
+
+  private def cleanUpStorePath(): Unit = {
+    storePath.foreach { p =>
+      if (p.exists()) {
+        p.listFiles().foreach(Utils.deleteRecursively)
+      }
+    }

Review Comment:
   
   There are three separate cases here; though related, let us look at them 
individually.
   
   a) Should we clean up?
   
   For cluster mode, it is not very useful - the cluster manager will do the 
same cleanup anyway.
   
   In client mode, like the excellent `spark-shell` example you gave 
@LuciferYang, this is indeed very relevant.
   Given this, let us keep both modes consistent and clean up in both (as we 
do for block manager dirs, for example).
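   
   A minimal sketch of what unconditional cleanup on close could look like, 
assuming the `storePath` field and `cleanUpStorePath` helper from the diff 
above (`Utils.deleteRecursively` is Spark's existing utility; the class shape 
here is simplified for illustration):
   
   ```scala
   import java.io.File
   import org.apache.spark.util.Utils
   
   private[spark] class DiskStoreExample(storePath: Option[File]) {
   
     def close(): Unit = {
       // ... close the underlying KVStore first ...
       cleanUpStorePath()
     }
   
     // Always remove the on-disk store, regardless of deploy mode,
     // mirroring how block manager dirs are cleaned up.
     private def cleanUpStorePath(): Unit = {
       storePath.foreach(Utils.deleteRecursively)
     }
   }
   ```
   
   Note `Utils.deleteRecursively` on the path itself (rather than iterating 
`listFiles()`) also removes the now-empty directory.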
   
   
   b) Do we need a config to clean up live UI local files?
   
   I would argue we should always clean up - we don't need a config for it, 
just as we clean up all other temp paths created during the Spark lifecycle.
   Is there any case where we do need to control this? +CC @gengliangwang 
   
   
   c) Issue with RocksDB locking.
   
   We should treat this as a bug - there are a couple of ways to fix it, though 
I would say we can simply follow what we already have in Spark - for example, 
use `getLocalDir`.
   Is there a reason to expose the ability to configure this path? Or should 
it simply be an implementation detail - something Spark automatically 
creates at start and removes at stop?
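   
   For example, deriving the store location from Spark's configured local 
dirs (a sketch; `Utils.getLocalDir` and `Utils.createTempDir` are existing 
Spark APIs, the `liveStorePath` helper name is hypothetical):
   
   ```scala
   import java.io.File
   import org.apache.spark.SparkConf
   import org.apache.spark.util.Utils
   
   def liveStorePath(conf: SparkConf): File = {
     // getLocalDir picks one of the spark.local.dir (or container-provided)
     // directories, so the RocksDB lock file lives in a per-app temp dir
     // that Spark already manages and removes.
     val localDir = Utils.getLocalDir(conf)
     Utils.createTempDir(localDir, "liveui")
   }
   ```
   
   That keeps the path an internal detail, and since each app gets a fresh 
temp dir, two concurrent apps can never contend for the same RocksDB lock.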
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

