Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19141#discussion_r138262309
  
    --- Diff: resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
    @@ -565,7 +565,6 @@ private[spark] class Client(
               distribute(jarsArchive.toURI.getPath,
                 resType = LocalResourceType.ARCHIVE,
                 destName = Some(LOCALIZED_LIB_DIR))
    -          jarsArchive.delete()
    --- End diff --
    
    What if your scenario and SPARK-20741's scenario are both encountered? It looks like your approach above would not work in that case.
    
    I'm wondering if we can copy or move this __spark_libs__.zip temp file to another, non-temp file and add that file to the dist cache. That non-temp file would not be deleted and would simply be overwritten on the next launch, so we would always keep only one copy.
    
    Besides, I think we already have several workarounds for this issue, such as spark.yarn.jars or spark.yarn.archive, so this corner case may not be worth fixing (just my thinking; normally people will not use the local FS in a real cluster).
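    
    For example, pointing spark.yarn.archive at a pre-uploaded archive of the Spark jars avoids building and distributing the __spark_libs__.zip temp file in the first place (the HDFS path below is just a placeholder):
    
    ```
    # spark-defaults.conf
    spark.yarn.archive  hdfs:///apps/spark/spark-libs.zip
    ```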
    
    
    


