[ https://issues.apache.org/jira/browse/SPARK-39735?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yikun Jiang updated SPARK-39735:
--------------------------------
    Description: 
Since R 4.2.x has the [following new change](https://github.com/r-devel/r-svn/blob/e6be1f6b14838016e78d6a91f48f21acec7fa4c4/doc/NEWS.Rd#L376):
  > Environment variables R_LIBS_USER and R_LIBS_SITE are both now set to the R system default if unset or empty, and can be set to NULL to indicate an empty list of user or site library directories.
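
The difference is easy to observe with a quick check (illustrative only, not part of the build; `Rscript` and `.libPaths()` are standard R tooling):
```
# Per the NEWS entry above: an *empty* R_LIBS_SITE now falls back to the
# R system default, while the literal value NULL means an empty list of
# site library directories.
$ R_LIBS_SITE= Rscript -e 'print(.libPaths())'
$ R_LIBS_SITE=NULL Rscript -e 'print(.libPaths())'
```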

The latest Ubuntu R package has also synced this change and modified /etc/R/Renviron accordingly:
```
$ docker run -ti ghcr.io/yikun/apache-spark-github-action-image:sparkr-master-2569799176 \
    cat /etc/R/Renviron | grep R_LIBS_SITE
R_LIBS_SITE=${R_LIBS_SITE:-'%S'}

$ docker run -ti ghcr.io/yikun/apache-spark-github-action-image:sparkr-master-2569799176 \
    cat /etc/R/Renviron.site | grep R_LIBS_SITE
## edd Jul 2007  Now use R_LIBS_SITE, not R_LIBS
## edd Mar 2022  Now in Renviron.site reflecting R_LIBS_SITE
R_LIBS_SITE="/usr/local/lib/R/site-library/:${R_LIBS_SITE}:/usr/lib/R/library"
```

So we add `R_LIBS_SITE` to the environment, taking its value from `/etc/R/Renviron.site`, to make sure the library search paths are right for SparkR.
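
A minimal sketch of that change (where exactly it lands in the image or workflow is illustrative; the value is taken from the `/etc/R/Renviron.site` excerpt above, assuming the embedded `${R_LIBS_SITE}` expands to empty):
```
# Export the site library path that /etc/R/Renviron.site would produce, so
# non-interactive tools such as dev/lint-r resolve packages from the Ubuntu
# site library (sketch; path list assumed from the excerpt above).
export R_LIBS_SITE="/usr/local/lib/R/site-library/:/usr/lib/R/library"
```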

Otherwise, even though `lintr` is installed, R cannot find it because `R_LIBS_SITE` is set incorrectly, and `dev/lint-r` fails with an error like:
```
$ dev/lint-r
Loading required namespace: SparkR
Loading required namespace: lintr
Failed with error:  'there is no package called 'lintr''
Installing package into '/usr/lib/R/site-library'
(as 'lib' is unspecified)
Error in contrib.url(repos, type) :
  trying to use CRAN without setting a mirror
Calls: install.packages -> startsWith -> contrib.url
Execution halted
```
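
With `R_LIBS_SITE` exported as sketched above, a quick check (illustrative) confirms the namespace resolves without triggering an install:
```
# Should load lintr from the site library and print the full search path.
$ Rscript -e 'loadNamespace("lintr"); print(.libPaths())'
```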

[1] https://cran.r-project.org/doc/manuals/r-devel/NEWS.html
[2] https://stat.ethz.ch/R-manual/R-devel/library/base/html/libPaths.html

> Enable base image build in lint job
> -----------------------------------
>
>                 Key: SPARK-39735
>                 URL: https://issues.apache.org/jira/browse/SPARK-39735
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Project Infra
>    Affects Versions: 3.4.0
>            Reporter: Yikun Jiang
>            Priority: Major


