[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15736230#comment-15736230 ]

Felix Cheung commented on SPARK-18810:
--------------------------------------

Also, to expand on the earlier note above: the main goal is to be able to
run existing tests, build vignettes, and so on
- without having to change any code, or
- without having to manually call install.spark in a separate session first to
cache the Spark jar.

This is why I think it makes sense to have an environment override instead of
an API parameter switch; a sketch of the idea follows.
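
Below is a minimal sketch of what such an override could look like inside
install.spark's URL resolution. The variable name SPARKR_DOWNLOAD_URL and the
helper resolve_spark_url are hypothetical placeholders for illustration, not
names from any actual patch:

    # Sketch only: resolve the download URL, honoring a hypothetical
    # environment override before falling back to the release mirror layout.
    resolve_spark_url <- function(version, hadoop_version) {
      override <- Sys.getenv("SPARKR_DOWNLOAD_URL")  # "" when the variable is unset
      if (nzchar(override)) {
        # An RC, nightly snapshot, or locally staged tarball can be used
        # without changing any test or vignette code.
        return(override)
      }
      # Default: the archive path that only exists for final releases.
      sprintf("https://archive.apache.org/dist/spark/spark-%s/spark-%s-bin-%s.tgz",
              version, version, hadoop_version)
    }

Because the override is read from the environment, CI jobs and release vote
testers can export a single variable and leave every R file untouched, which
is the advantage over an API parameter.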


> SparkR install.spark does not work for RCs, snapshots
> -----------------------------------------------------
>
>                 Key: SPARK-18810
>                 URL: https://issues.apache.org/jira/browse/SPARK-18810
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.0.2, 2.1.0
>            Reporter: Shivaram Venkataraman
>
> We now publish source archives of the SparkR package in RCs and in nightly 
> snapshot builds. One problem that still remains is that `install.spark` 
> does not work for these, as it looks for the final Spark version to be 
> present in the Apache download mirrors.
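
For illustration, with an override like the one sketched above, testing
against a staged RC could look like this; the variable name and the URL are
hypothetical examples following the usual Apache dev staging layout, not real
artifacts:

    # Hypothetical usage; the tests and vignettes themselves stay unchanged.
    Sys.setenv(SPARKR_DOWNLOAD_URL =
      "https://dist.apache.org/repos/dist/dev/spark/spark-2.1.0-rc5-bin/spark-2.1.0-bin-hadoop2.7.tgz")
    library(SparkR)
    sparkR.session()  # would trigger install.spark(), which picks up the override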



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
