FWIW it worked for me, but I may not be executing the same thing. I
was running the commands given in R/DOCUMENTATION.md.

Creating the vignette succeeded for me on branch-2.0.

Maybe it's a version or library issue? What version of R do you have
installed, and are you up to date with packages like devtools and roxygen2?
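A couple of quick checks that might narrow it down (a rough sketch, not the exact release-script commands; it assumes Rscript is on the PATH and that SPARK_HOME points at a Maven-built checkout with the usual assembly/target layout):

```shell
# 1) Print the R version and the installed devtools/roxygen2 versions,
#    since a stale toolchain can break the knit. Guarded in case Rscript
#    isn't on the PATH.
command -v Rscript >/dev/null &&
  Rscript -e 'cat(R.version.string, "\n"); for (p in c("devtools","roxygen2")) print(packageVersion(p))'

# 2) The "Could not find or load main class org.apache.spark.launcher.Main"
#    error in the log usually means the Spark jars were never built in the
#    checkout the vignette points at. Check for the launcher jar in the
#    assumed assembly output location.
SPARK_HOME="${SPARK_HOME:-$PWD}"
if ls "$SPARK_HOME"/assembly/target/scala-*/jars/spark-launcher_*.jar >/dev/null 2>&1; then
  echo "launcher jar present"
else
  echo "launcher jar missing - build Spark before generating the R docs"
fi
```

If the jar is missing, building the tree first (e.g. with build/mvn -DskipTests package) before rerunning the doc generation would be worth trying.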

On Thu, Sep 22, 2016 at 7:47 AM, Reynold Xin <r...@databricks.com> wrote:
> I'm working on packaging 2.0.1 rc but encountered a problem: R doc fails to
> build. Can somebody take a look at the issue ASAP?
>
>
>
> ** knitting documentation of write.parquet
> ** knitting documentation of write.text
> ** knitting documentation of year
> ~/workspace/spark-release-docs/spark/R
> ~/workspace/spark-release-docs/spark/R
>
>
> processing file: sparkr-vignettes.Rmd
>
>   |
>   |                                                                 |   0%
>   |
>   |.                                                                |   1%
>    inline R code fragments
>
>
>   |
>   |.                                                                |   2%
> label: unnamed-chunk-1 (with options)
> List of 1
>  $ message: logi FALSE
>
> Loading required package: methods
>
> Attaching package: 'SparkR'
>
> The following objects are masked from 'package:stats':
>
>     cov, filter, lag, na.omit, predict, sd, var, window
>
> The following objects are masked from 'package:base':
>
>     as.data.frame, colnames, colnames<-, drop, intersect, rank,
>     rbind, sample, subset, summary, transform, union
>
>
>   |
>   |..                                                               |   3%
>   ordinary text without R code
>
>
>   |
>   |..                                                               |   4%
> label: unnamed-chunk-2 (with options)
> List of 1
>  $ message: logi FALSE
>
> Spark package found in SPARK_HOME:
> /home/jenkins/workspace/spark-release-docs/spark
> Error: Could not find or load main class org.apache.spark.launcher.Main
> Quitting from lines 30-31 (sparkr-vignettes.Rmd)
> Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap,  :
>   JVM is not ready after 10 seconds
> Calls: render ... eval -> eval -> sparkR.session -> sparkR.sparkContext
>
> Execution halted
> jekyll 2.5.3 | Error:  R doc generation failed
> Deleting credential directory
> /home/jenkins/workspace/spark-release-docs/spark-utils/new-release-scripts/jenkins/jenkins-credentials-IXCkuX6w
> Build step 'Execute shell' marked build as failure
> [WS-CLEANUP] Deleting project workspace...[WS-CLEANUP] done
> Finished: FAILURE

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
