Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21533#discussion_r194812492
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -1517,9 +1517,12 @@ class SparkContext(config: SparkConf) extends Logging {
        * only supported for Hadoop-supported filesystems.
        */
       def addFile(path: String, recursive: Boolean): Unit = {
    -    val uri = new Path(path).toUri
    +    var uri = new Path(path).toUri
         val schemeCorrectedPath = uri.getScheme match {
    -      case null | "local" => new File(path).getCanonicalFile.toURI.toString
    +      case null | "local" =>
     +        // SPARK-24195: Local is not a valid scheme for FileSystem, we should only keep the path here.
    +        uri = new Path(uri.getPath).toUri
    --- End diff --
    
    Why is this needed? Can't we just do `new File(uri.getPath).getCanonicalFile.toURI.toString` without this line?
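
    For context, a minimal sketch of what the suggested one-liner would do, using `java.net.URI` in place of Hadoop's `Path` (an assumption here, to keep the example free of Hadoop dependencies; `schemeCorrectedPath` is a hypothetical helper mirroring the snippet under review, not Spark's actual method):

    ```scala
    import java.io.File
    import java.net.URI

    object SchemeStrippingSketch {
      // Mirrors the scheme-correction logic from the diff: for a null or
      // "local" scheme, keep only the path component and canonicalize it,
      // since "local" is not a scheme FileSystem can resolve.
      def schemeCorrectedPath(path: String): String = {
        val uri = new URI(path)
        uri.getScheme match {
          case null | "local" =>
            // The reviewer's suggestion: strip the scheme by taking
            // uri.getPath directly, with no intermediate reassignment.
            new File(uri.getPath).getCanonicalFile.toURI.toString
          case _ => path
        }
      }

      def main(args: Array[String]): Unit = {
        // "local:/tmp/foo" has scheme "local" and path "/tmp/foo", so it
        // is rewritten to a file: URI; other schemes pass through as-is.
        println(schemeCorrectedPath("local:/tmp/foo"))
        println(schemeCorrectedPath("hdfs://nn:8020/data/foo"))
      }
    }
    ```

    The point of the question is that `uri.getPath` already discards the `local` scheme, so the extra `var uri` reassignment looks redundant.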


---
