advancedxy commented on PR #37417:
URL: https://github.com/apache/spark/pull/37417#issuecomment-1517797912

   @pralabhkumar thanks for your work. I noticed a similar issue when running Spark applications on K8s, so this is a helpful feature.
   
   However, this PR may introduce some inefficiency: files/jars are downloaded twice when running in K8s cluster mode.
   
   ```scala
     if (deployMode == CLIENT) {
         // jars are downloaded once
         localJars = Option(args.jars).map {
           downloadFileList(_, targetDir, sparkConf, hadoopConf)
         }.orNull
         // py files are downloaded once
         localPyFiles = Option(args.pyFiles).map {
           downloadFileList(_, targetDir, sparkConf, hadoopConf)
         }.orNull
         if (isKubernetesClusterModeDriver) {
            def downloadResourcesToCurrentDirectory(uris: String, isArchive: Boolean = false): String = {
              ...
            }
   
           val filesLocalFiles = Option(args.files).map {
             downloadResourcesToCurrentDirectory(_)
           }.orNull
           // jars are downloaded again
           val jarsLocalJars = Option(args.jars).map {
             downloadResourcesToCurrentDirectory(_)
           }.orNull
           val archiveLocalFiles = Option(args.archives).map {
             downloadResourcesToCurrentDirectory(_, true)
           }.orNull
           // py files are downloaded again
           val pyLocalFiles = Option(args.pyFiles).map {
             downloadResourcesToCurrentDirectory(_)
           }.orNull
          }
     }
   ```
   
   Would you mind creating a follow-up PR to address this issue? @pralabhkumar
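   
   One possible direction (just a rough sketch, untested, reusing only the names from the excerpt above) would be to skip the generic client-mode download of jars/py files whenever the K8s cluster-mode driver branch is going to download them into the current directory anyway:
   
   ```scala
     // rough sketch only: download jars/py files exactly once in K8s cluster mode
     if (deployMode == CLIENT) {
       if (isKubernetesClusterModeDriver) {
         def downloadResourcesToCurrentDirectory(uris: String, isArchive: Boolean = false): String = {
           ...
         }
         // jars and py files are downloaded once, into the current directory
         localJars = Option(args.jars).map(downloadResourcesToCurrentDirectory(_)).orNull
         localPyFiles = Option(args.pyFiles).map(downloadResourcesToCurrentDirectory(_)).orNull
         ... // files and archives handled as before
       } else {
         // all other client-mode deployments keep the existing single download
         localJars = Option(args.jars).map(downloadFileList(_, targetDir, sparkConf, hadoopConf)).orNull
         localPyFiles = Option(args.pyFiles).map(downloadFileList(_, targetDir, sparkConf, hadoopConf)).orNull
       }
     }
   ```
   
   Whether localJars / localPyFiles are the right place to put the K8s results (as opposed to writing them back into args.jars / args.pyFiles) would of course need to be checked against the rest of SparkSubmit.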
   
   Also, there's another catch when running Spark on K8s with --files/--archives: these files/archives are already downloaded here, but since they are still passed along as args.files and args.archives, the SparkContext will copy them (and/or untar them) again when the context is constructed. See the relevant code:
   
https://github.com/apache/spark/blob/d407a42090d7c027050be7ee723f7e3d8f686ed7/core/src/main/scala/org/apache/spark/SparkContext.scala#L440-L443
   And
   
https://github.com/apache/spark/blob/d407a42090d7c027050be7ee723f7e3d8f686ed7/core/src/main/scala/org/apache/spark/SparkContext.scala#L524-L544
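   
   For reference, here is a tiny standalone snippet (illustrative only; the temp file and app name are made up) showing that behaviour at the SparkContext level: anything still listed in spark.files is re-added while the context is constructed and gets copied under the SparkFiles root directory, even when it is already local to the driver:
   
   ```scala
   import java.nio.file.Files
   import org.apache.spark.{SparkConf, SparkContext, SparkFiles}
   
   object DoubleCopyDemo {
     def main(args: Array[String]): Unit = {
       // pretend spark-submit already downloaded this file next to the driver
       val local = Files.createTempFile("already-downloaded", ".txt")
       Files.write(local, "payload".getBytes("UTF-8"))
   
       val conf = new SparkConf()
         .setMaster("local[1]")
         .setAppName("double-copy-demo")
         .set("spark.files", local.toUri.toString)
   
       // constructing the context re-adds everything in spark.files ...
       val sc = new SparkContext(conf)
       // ... so a second copy now lives under the SparkFiles root directory
       println(SparkFiles.get(local.getFileName.toString))
       sc.stop()
     }
   }
   ```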
   
   cc @Ngone51 @holdenk 

