Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19643#discussion_r148710799
  
    --- Diff: R/pkg/R/context.R ---
    @@ -319,6 +319,27 @@ spark.addFile <- function(path, recursive = FALSE) {
       invisible(callJMethod(sc, "addFile", suppressWarnings(normalizePath(path)), recursive))
     }
     
    +#' Adds a JAR dependency for Spark tasks to be executed in the future.
    +#'
    +#' The \code{path} passed can be either a local file, a file in HDFS (or other Hadoop-supported
    +#' filesystems), an HTTP, HTTPS or FTP URI, or local:/path for a file on every worker node.
    +#' If \code{addToCurrentClassLoader} is true, add the jar to the current driver.
    --- End diff ---
    
    hmm, is this right `add the jar to the current driver.`?
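
    For context, a minimal usage sketch of the API this doc comment describes. The exposed R function name (`spark.addJar` below) and the default of `addToCurrentClassLoader` are assumptions, since neither is visible in this hunk:

        # Hypothetical usage; the R wrapper is assumed here to be named spark.addJar
        sparkR.session()

        # JAR on the driver's local filesystem, distributed to the cluster
        spark.addJar("/tmp/my-udfs.jar")

        # JAR on HDFS, also added to the current driver's class loader
        spark.addJar("hdfs:///libs/my-udfs.jar", addToCurrentClassLoader = TRUE)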


---
