[ https://issues.apache.org/jira/browse/SPARK-17428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15471040#comment-15471040 ]

Shivaram Venkataraman commented on SPARK-17428:
-----------------------------------------------

Yeah, so it should be relatively easy to install any R package from CRAN or a 
set of repos into a specified directory. The `lib` option at 
https://stat.ethz.ch/R-manual/R-devel/library/utils/html/install.packages.html 
can be used for this.

So one way to do this would be to take in the names of R packages and/or 
tar.gz files and invoke `install.packages` with the appropriate YARN or Mesos 
local dir passed in as `lib`.
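
For illustration, a minimal sketch of that approach in R, assuming a 
hypothetical job-local library directory and placeholder package/tarball names 
(none of these paths or names come from the proposal):

{code}
# Hypothetical job-local library directory, standing in for the YARN/Mesos
# local dir that Spark would pass down to the worker.
local_lib <- file.path(tempdir(), "r-libs")
dir.create(local_lib, recursive = TRUE, showWarnings = FALSE)

# Packages given by name are resolved against CRAN / the configured repos
# and installed into the private library via the `lib` argument.
install.packages(c("data.table", "stringr"), lib = local_lib,
                 repos = "https://cran.r-project.org")

# Packages given as tar.gz files are installed from local source instead.
install.packages("/path/to/mypkg_0.1.0.tar.gz", lib = local_lib,
                 repos = NULL, type = "source")

# Prepend the private library so the SparkR worker process can load from it.
.libPaths(c(local_lib, .libPaths()))
{code}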

I think [~sunrui] has a good point about compiling packages on one machine vs. 
many machines. I think compiling only on the driver will save some work. Just 
as a point of reference, how do we handle source packages in PySpark?

> SparkR executors/workers support virtualenv
> -------------------------------------------
>
>                 Key: SPARK-17428
>                 URL: https://issues.apache.org/jira/browse/SPARK-17428
>             Project: Spark
>          Issue Type: New Feature
>          Components: SparkR
>            Reporter: Yanbo Liang
>
> Many users need to use third-party R packages in executors/workers, but 
> SparkR cannot satisfy this requirement elegantly. For example, you have to 
> ask the IT staff/administrators of the cluster to deploy these R packages on 
> each executor/worker node, which is very inflexible.
> I think we should support third-party R packages for SparkR users, as we do 
> for jar packages, in the following two scenarios:
> 1. Users can install R packages from CRAN or a custom CRAN-like repository 
> on each executor.
> 2. Users can load their local R packages and install them on each executor.
> To achieve this goal, the first step is to make SparkR executors support a 
> virtualenv-like mechanism, similar to Python's conda. I have investigated 
> and found that packrat (http://rstudio.github.io/packrat/) is one of the 
> candidates for supporting a virtualenv for R. Packrat is a dependency 
> management system for R and can isolate the dependent R packages in its own 
> private package space. SparkR users could then install third-party packages 
> at application scope (destroyed after the application exits), without 
> needing to bother IT/administrators to install these packages manually.
> I would like to know whether this makes sense.
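
As an aside, a minimal sketch of the packrat workflow the description refers 
to, using a hypothetical project directory and package name (packrat gives 
each project its own private library, which is the kind of isolation an 
application-scoped SparkR environment would need):

{code}
# On the driver, in a hypothetical project directory:
setwd("~/my_sparkr_app")
packrat::init()              # create a private package library and enter packrat mode
install.packages("data.table")  # installs into the project's private library
packrat::snapshot()          # record exact dependency versions in packrat.lock

# Later, on another machine with the project files present (e.g. an executor):
packrat::restore()           # reinstall the recorded packages into the private library
{code}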



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
