Repository: spark
Updated Branches:
  refs/heads/master 870b9d9aa -> 8feb799af


[SPARK-20197][SPARKR] CRAN check fail with package installation

## What changes were proposed in this pull request?

Tests failed because SPARK_HOME was not set before Spark was installed. This change runs install.spark() before the test setup code that reads SPARK_HOME, so the directory bookkeeping resolves to a real path.

Author: Felix Cheung <felixcheun...@hotmail.com>

Closes #17516 from felixcheung/rdircheckincran.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8feb799a
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8feb799a
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8feb799a

Branch: refs/heads/master
Commit: 8feb799af0bb67618310947342e3e4d2a77aae13
Parents: 870b9d9
Author: Felix Cheung <felixcheun...@hotmail.com>
Authored: Fri Apr 7 11:17:49 2017 -0700
Committer: Felix Cheung <felixche...@apache.org>
Committed: Fri Apr 7 11:17:49 2017 -0700

----------------------------------------------------------------------
 R/pkg/tests/run-all.R | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/8feb799a/R/pkg/tests/run-all.R
----------------------------------------------------------------------
diff --git a/R/pkg/tests/run-all.R b/R/pkg/tests/run-all.R
index cefaadd..29812f8 100644
--- a/R/pkg/tests/run-all.R
+++ b/R/pkg/tests/run-all.R
@@ -22,12 +22,13 @@ library(SparkR)
 options("warn" = 2)
 
 # Setup global test environment
+# Install Spark first to set SPARK_HOME
+install.spark()
+
 sparkRDir <- file.path(Sys.getenv("SPARK_HOME"), "R")
 sparkRFilesBefore <- list.files(path = sparkRDir, all.files = TRUE)
 sparkRWhitelistSQLDirs <- c("spark-warehouse", "metastore_db")
 invisible(lapply(sparkRWhitelistSQLDirs,
                  function(x) { unlink(file.path(sparkRDir, x), recursive = 
TRUE, force = TRUE)}))
 
-install.spark()
-
 test_package("SparkR")
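The ordering in the hunk above is the whole fix: on a host without a Spark distribution (such as a CRAN check machine), SPARK_HOME is empty until install.spark() runs, so `file.path(Sys.getenv("SPARK_HOME"), "R")` previously resolved to the bogus path "/R". A minimal sketch of the corrected setup order, assuming install.spark() sets SPARK_HOME when it is unset (its behavior per the SparkR docs):

```r
library(SparkR)

# Turn warnings into errors so the CRAN check fails loudly.
options("warn" = 2)

# Must run first on a clean host: downloads a Spark distribution if
# needed and sets SPARK_HOME for the rest of the session.
install.spark()

# Only now does Sys.getenv("SPARK_HOME") return a usable path; before
# the fix it returned "" here and sparkRDir became "/R".
sparkRDir <- file.path(Sys.getenv("SPARK_HOME"), "R")
sparkRFilesBefore <- list.files(path = sparkRDir, all.files = TRUE)
```

On developer machines where SPARK_HOME is already exported, install.spark() is effectively a no-op for this purpose, which is why the bug only surfaced during CRAN package installation checks.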

