Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14960#discussion_r77543535

--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---

```
@@ -1900,7 +1900,20 @@ private[spark] object Utils extends Logging {
    */
   def resolveURI(path: String): URI = {
     try {
-      val uri = new URI(path)
+      val osSafePath = if (Path.isWindowsAbsolutePath(path, false)) {
+        // Make sure C:/ part becomes /C/.
+        val windowsUri = new URI(path)
+        val driveLetter = windowsUri.getScheme
+        s"/$driveLetter/${windowsUri.getSchemeSpecificPart()}"
+      } else if (Path.isWindowsAbsolutePath(path, true)) {
+        // Make sure /C:/ part becomes /C/.
+        val windowsUri = new URI(path.substring(1))
+        val driveLetter = windowsUri.getScheme
+        s"/$driveLetter/${windowsUri.getSchemeSpecificPart()}"
```

--- End diff --

When we pass a path from R to Scala, we first call `normalizePath` and then pass its output to the JVM [1, 2]. So the strings coming from R look something like

```
> d <- normalizePath("./README.md")
> d
[1] "C:\\Users\\shivaram\\Downloads\\spark-1-sparkr-win-tests-fix\\spark-1-sparkr-win-tests-fix\\README.md"
```

There is also an option in `normalizePath` to use `/` as the separator on Windows alone. With that option the output looks like

```
> d <- normalizePath("./README.md", winslash="/")
> d
[1] "C:/Users/shivaram/Downloads/spark-1-sparkr-win-tests-fix/spark-1-sparkr-win-tests-fix/README.md"
```

If the second form works better with `resolveURI`, we can simply change the R code to pass in this option.

[1] https://github.com/apache/spark/blob/6d86403d8b252776effcddd71338b4d21a224f9b/R/pkg/R/context.R#L75
[2] https://github.com/apache/spark/blob/6d86403d8b252776effcddd71338b4d21a224f9b/R/pkg/R/mllib.R#L905
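For context on why the diff needs the drive-letter handling at all: when a forward-slash Windows path such as `C:/Users/...` is handed to `java.net.URI`, the drive letter is parsed as the URI scheme. A minimal sketch (plain Java rather than the diff's Scala, and the path is just an illustrative example) showing the parse that `resolveURI` has to work around:

```java
import java.net.URI;

public class WindowsUriDemo {
    public static void main(String[] args) throws Exception {
        // A winslash-style Windows path: URI treats "C" as the scheme,
        // which is exactly why resolveURI special-cases Windows paths.
        URI u = new URI("C:/Users/shivaram/README.md");
        System.out.println(u.getScheme());              // "C"
        System.out.println(u.getSchemeSpecificPart());  // "/Users/shivaram/README.md"
    }
}
```

This is the behavior the `s"/$driveLetter/..."` rewrite in the diff compensates for, and it occurs regardless of whether R sends backslashes or `winslash="/"` forward slashes, since only the latter reaches `new URI(path)` unescaped.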