[ 
https://issues.apache.org/jira/browse/SPARK-12910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Shivaram Venkataraman updated SPARK-12910:
------------------------------------------
    Shepherd:   (was: Shivram Mani)

> Support for specifying version of R to use while creating sparkR libraries
> --------------------------------------------------------------------------
>
>                 Key: SPARK-12910
>                 URL: https://issues.apache.org/jira/browse/SPARK-12910
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>         Environment: Linux
>            Reporter: Shubhanshu Mishra
>            Priority: Minor
>              Labels: installation, sparkR
>             Fix For: 2.0.0
>
>
> When we run `$SPARK_HOME/R/install-dev.sh`, it uses the default system R.
> However, a user might have their own locally installed version of R, so
> there should be a way to specify which R installation to use.
> I have fixed this in my code with the following patch:
> {code:bash}
> $ git diff HEAD
> diff --git a/R/README.md b/R/README.md
> index 005f56d..99182e5 100644
> --- a/R/README.md
> +++ b/R/README.md
> @@ -1,6 +1,15 @@
>  # R on Spark
>  
>  SparkR is an R package that provides a light-weight frontend to use Spark from R.
> +### Installing sparkR
> +
> +Libraries of sparkR need to be created in `$SPARK_HOME/R/lib`. This can be done by running the script `$SPARK_HOME/R/install-dev.sh`.
> +By default the above script uses the system-wide installation of R. However, this can be changed to any user-installed location of R by giving the full path of the `$R_HOME` as the first argument to the install-dev.sh script.
> +Example: 
> +```
> +# where /home/username/R is where R is installed and /home/username/R/bin contains the files R and Rscript
> +./install-dev.sh /home/username/R 
> +```
>  
>  ### SparkR development
>  
> diff --git a/R/install-dev.sh b/R/install-dev.sh
> index 4972bb9..a8efa86 100755
> --- a/R/install-dev.sh
> +++ b/R/install-dev.sh
> @@ -35,12 +35,19 @@ LIB_DIR="$FWDIR/lib"
>  mkdir -p $LIB_DIR
>  
>  pushd $FWDIR > /dev/null
> +if [ ! -z "$1" ]
> +  then
> +    R_HOME="$1/bin"
> +  else
> +    R_HOME="$(dirname "$(which R)")"
> +fi
> +echo "USING R_HOME = $R_HOME"
>  
>  # Generate Rd files if devtools is installed
> -Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
> +"$R_HOME/"Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
>  
>  # Install SparkR to $LIB_DIR
> -R CMD INSTALL --library=$LIB_DIR $FWDIR/pkg/
> +"$R_HOME/"R CMD INSTALL --library=$LIB_DIR $FWDIR/pkg/
>  
> # Zip the SparkR package so that it can be distributed to worker nodes on YARN
>  cd $LIB_DIR
> {code}
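> The argument handling in the patch above can be sketched in isolation. This is a minimal sketch, not part of the actual install-dev.sh: `pick_r_bin` is a hypothetical helper name, and `/home/username/R` is an assumed example path. It mirrors the patch's logic: use the R installation root passed as the first argument if present, otherwise fall back to the R found on PATH.
>
> ```shell
> #!/bin/sh
> # Hypothetical helper (not in install-dev.sh) mirroring the patch's branch:
> # if an R installation root is given as $1, use its bin/ directory;
> # otherwise locate R on PATH and use its directory.
> pick_r_bin() {
>   if [ -n "$1" ]; then
>     # Caller passed an R installation root, e.g. /home/username/R
>     echo "$1/bin"
>   else
>     # No argument: fall back to whichever R is first on PATH
>     dirname "$(command -v R)"
>   fi
> }
>
> pick_r_bin /home/username/R   # prints /home/username/R/bin
> ```
>
> Quoting the command substitutions keeps the fallback branch correct even when R lives under a path containing spaces.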



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
