I'm working on packaging the 2.0.1 RC but ran into a problem: the R docs fail to
build. Could somebody take a look at this issue ASAP?



** knitting documentation of write.parquet
** knitting documentation of write.text
** knitting documentation of year
~/workspace/spark-release-docs/spark/R
~/workspace/spark-release-docs/spark/R


processing file: sparkr-vignettes.Rmd

  |                                                                 |   0%
  |.                                                                |   1%
   inline R code fragments


  |.                                                                |   2%
label: unnamed-chunk-1 (with options)
List of 1
 $ message: logi FALSE

Loading required package: methods

Attaching package: 'SparkR'

The following objects are masked from 'package:stats':

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

    as.data.frame, colnames, colnames<-, drop, intersect, rank,
    rbind, sample, subset, summary, transform, union


  |..                                                               |   3%
  ordinary text without R code


  |..                                                               |   4%
label: unnamed-chunk-2 (with options)
List of 1
 $ message: logi FALSE

Spark package found in SPARK_HOME:
/home/jenkins/workspace/spark-release-docs/spark
Error: Could not find or load main class org.apache.spark.launcher.Main
Quitting from lines 30-31 (sparkr-vignettes.Rmd)
Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap,  :
  JVM is not ready after 10 seconds
Calls: render ... eval -> eval -> sparkR.session -> sparkR.sparkContext

Execution halted
jekyll 2.5.3 | Error:  R doc generation failed
Deleting credential directory
/home/jenkins/workspace/spark-release-docs/spark-utils/new-release-scripts/jenkins/jenkins-credentials-IXCkuX6w
Build step 'Execute shell' marked build as failure
[WS-CLEANUP] Deleting project workspace...[WS-CLEANUP] done
Finished: FAILURE
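For whoever picks this up: "Could not find or load main class org.apache.spark.launcher.Main" usually means the Spark jars haven't been built under the SPARK_HOME that the vignette picked up, so spark-submit can't start and sparkR.session() then times out waiting for the JVM. A quick sanity check on the Jenkins box could look like the sketch below (the jar path and the build command are assumptions based on a typical Spark 2.x checkout, not taken from this log):

```shell
# Hypothetical diagnostic: confirm the launcher jar exists under SPARK_HOME
# before rerunning the R doc build. The default SPARK_HOME here is a guess
# based on the workspace path in the log.
SPARK_HOME="${SPARK_HOME:-$HOME/workspace/spark-release-docs/spark}"

if ls "$SPARK_HOME"/jars/spark-launcher_*.jar >/dev/null 2>&1; then
  echo "launcher jar found"
else
  # Without a built Spark, sparkR.session() fails exactly as in the log above.
  echo "launcher jar missing; build Spark first, e.g. ./build/mvn -DskipTests package"
fi
```

If the jar is missing, building Spark (or pointing SPARK_HOME at a built distribution) before the doc step should clear both errors.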
