GitHub user junyangq opened a pull request: https://github.com/apache/spark/pull/14448
[SPARK-16579][SparkR] Add install.spark function

## What changes were proposed in this pull request?

Add an `install.spark` function to the SparkR package. Users can run `install.spark()` to install Spark to a local directory from within R if no existing installation is found. The function searches for installation files in three ways, in the following order:

1. a user-provided mirror site in `mirrorUrl`
2. a mirror site suggested by the Apache website
3. a hardcoded backup option

## How was this patch tested?

Manual tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/junyangq/spark SPARK-16579-2.0

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14448.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #14448

----

commit 0b676314a13a8a796ee45baf99f4bc6d936d01d5
Author: Junyang Qian <junya...@databricks.com>
Date: 2016-07-29T22:24:07Z

    Add install.spark function to SparkR

    Users can download and install Spark package inside R console
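----

A minimal usage sketch of the proposed function, based only on the description above. Only `mirrorUrl` is named in this PR text; any other argument names shown here (`hadoopVersion`, `localDir`, `overwrite`) are assumptions for illustration, not confirmed by this change.

```r
# Sketch: installing Spark from an R session with the proposed install.spark().
library(SparkR)

# Default call: use an existing local installation if found, otherwise try
# the mirror suggested by the Apache website, then the hardcoded backup.
install.spark()

# Provide a specific mirror explicitly; per the description, a user-supplied
# mirrorUrl is checked before the suggested mirror and the backup option.
install.spark(mirrorUrl = "http://apache.mirrors.example.com")

# Hypothetical extra arguments (names assumed, not stated in this PR text):
# pick a Hadoop build, install into a custom directory, and skip re-download.
install.spark(hadoopVersion = "2.7", localDir = "~/spark", overwrite = FALSE)
```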