Repository: spark
Updated Branches:
  refs/heads/branch-2.0 b25a8e6e1 -> f14f47f07


Skip building R vignettes if Spark is not built

## What changes were proposed in this pull request?

When we build the docs separately we don't have the JAR files from the Spark 
build in the same tree. Since the SparkR vignettes need to launch a 
SparkContext in order to be built, we skip building them if the JAR files 
don't exist.
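
At a high level, the change adds a guard around the vignette build in `R/create-docs.sh`. A simplified sketch of the guard (the actual change, including the full `render` invocation, is in the diff below):

```
# Spark JARs live under jars/ in a release layout, or under
# assembly/target/scala-<version>/jars/ in a source build.
if [ -f "${SPARK_HOME}/RELEASE" ]; then
  SPARK_JARS_DIR="${SPARK_HOME}/jars"
else
  SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
fi

# Build the vignettes only when the JARs are present; otherwise print a
# message and skip, so a docs-only build no longer fails.
if [ -d "$SPARK_JARS_DIR" ]; then
  Rscript -e 'rmarkdown::render("pkg/vignettes/sparkr-vignettes.Rmd")'  # simplified
else
  echo "Skipping R vignettes as Spark JARs not found in $SPARK_HOME"
fi
```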

## How was this patch tested?

To test this we can run the following:
```
build/mvn -DskipTests -Psparkr clean
./R/create-docs.sh
```
You should see a line `Skipping R vignettes as Spark JARs not found` at the end of the output.
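
To additionally confirm that the vignettes are still built when the JARs are present (not part of the original test steps; this assumes a standard Maven package build), something like the following should work:
```
build/mvn -DskipTests -Psparkr package
./R/create-docs.sh
```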

Author: Shivaram Venkataraman <shiva...@cs.berkeley.edu>

Closes #15200 from shivaram/sparkr-vignette-skip.

(cherry picked from commit 9f24a17c59b1130d97efa7d313c06577f7344338)
Signed-off-by: Reynold Xin <r...@databricks.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f14f47f0
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f14f47f0
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f14f47f0

Branch: refs/heads/branch-2.0
Commit: f14f47f072a392df0ebe908f1c57b6eb858105b7
Parents: b25a8e6
Author: Shivaram Venkataraman <shiva...@cs.berkeley.edu>
Authored: Thu Sep 22 11:52:42 2016 -0700
Committer: Reynold Xin <r...@databricks.com>
Committed: Thu Sep 22 11:54:51 2016 -0700

----------------------------------------------------------------------
 R/create-docs.sh | 25 ++++++++++++++++++++++---
 1 file changed, 22 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f14f47f0/R/create-docs.sh
----------------------------------------------------------------------
diff --git a/R/create-docs.sh b/R/create-docs.sh
index 0dfba22..69ffc5f 100755
--- a/R/create-docs.sh
+++ b/R/create-docs.sh
@@ -30,6 +30,13 @@ set -e
 
 # Figure out where the script is
 export FWDIR="$(cd "`dirname "$0"`"; pwd)"
+export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
+
+# Required for setting SPARK_SCALA_VERSION
+. "${SPARK_HOME}"/bin/load-spark-env.sh
+
+echo "Using Scala $SPARK_SCALA_VERSION"
+
 pushd $FWDIR
 
 # Install the package (this will also generate the Rd files)
@@ -45,9 +52,21 @@ Rscript -e 'libDir <- "../../lib"; library(SparkR, lib.loc=libDir); library(knit
 
 popd
 
-# render creates SparkR vignettes
-Rscript -e 'library(rmarkdown); paths <- .libPaths(); .libPaths(c("lib", paths)); Sys.setenv(SPARK_HOME=tools::file_path_as_absolute("..")); render("pkg/vignettes/sparkr-vignettes.Rmd"); .libPaths(paths)'
+# Find Spark jars.
+if [ -f "${SPARK_HOME}/RELEASE" ]; then
+  SPARK_JARS_DIR="${SPARK_HOME}/jars"
+else
+  SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION/jars"
+fi
+
+# Only create vignettes if Spark JARs exist
+if [ -d "$SPARK_JARS_DIR" ]; then
+  # render creates SparkR vignettes
+  Rscript -e 'library(rmarkdown); paths <- .libPaths(); .libPaths(c("lib", paths)); Sys.setenv(SPARK_HOME=tools::file_path_as_absolute("..")); render("pkg/vignettes/sparkr-vignettes.Rmd"); .libPaths(paths)'
 
-find pkg/vignettes/. -not -name '.' -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
+  find pkg/vignettes/. -not -name '.' -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
+else
+  echo "Skipping R vignettes as Spark JARs not found in $SPARK_HOME"
+fi
 
 popd

