kecookier commented on code in PR #5075:
URL: https://github.com/apache/incubator-gluten/pull/5075#discussion_r1535237040


##########
backends-velox/src/main/scala/org/apache/spark/sql/expression/UDFResolver.scala:
##########
@@ -152,34 +143,39 @@ object UDFResolver extends Logging {
 
   // Get the full paths of all libraries.
   // If it's a directory, get all files ends with ".so" recursively.
-  def getAllLibraries(files: String, sparkConf: SparkConf, canAccessSparkFiles: Boolean): String = {
+  def getAllLibraries(files: String, sparkConf: SparkConf): String = {
     val hadoopConf = SparkHadoopUtil.newConfiguration(sparkConf)
     files
       .split(",")
       .map {
         f =>
-          val file = new File(f)
-          // Relative paths should be uploaded via --files or --archives
-          // Use SparkFiles.get to download and unpack
-          if (!file.isAbsolute) {
-            if (!canAccessSparkFiles) {
-              throw new IllegalArgumentException(
-                "On yarn-client mode, driver only accepts absolute paths, but 
got " + f)
+          val uri = Utils.resolveURI(f)

Review Comment:
   Thank you for your patient explanation, Rong!
   
   I previously thought that in yarn-client mode, `spark.yarn.dist.files` would also copy the files to the driver; I had mixed up the AM and the driver.
   
   Let's fix this issue.
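   For context, here is a minimal sketch of the URI-resolution behavior this diff switches to. It mimics the semantics of Spark's `Utils.resolveURI` (a string that already carries a scheme such as `hdfs://` is kept as-is; a scheme-less path is resolved to an absolute `file:` URI) with a hypothetical stand-in helper, not Spark's actual implementation:

   ```scala
   import java.net.URI
   import java.nio.file.Paths

   object ResolveUriSketch {
     // Hypothetical stand-in for org.apache.spark.util.Utils.resolveURI:
     // keep the URI unchanged if it already has a scheme, otherwise
     // resolve the path against the working directory into a file: URI.
     def resolveURI(path: String): URI = {
       val uri = new URI(path)
       if (uri.getScheme != null) uri
       else Paths.get(path).toAbsolutePath.normalize.toUri
     }

     def main(args: Array[String]): Unit = {
       // A fully qualified path keeps its scheme.
       println(resolveURI("hdfs://nn:8020/libs/my_udf.so").getScheme)
       // A relative path becomes an absolute file: URI.
       println(resolveURI("libs/my_udf.so").getScheme)
     }
   }
   ```

   Under these semantics the driver no longer needs a separate `canAccessSparkFiles` check: every input is normalized to a URI first, and scheme-specific handling can follow from there.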



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@gluten.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

