Right - I think we should move on with 2.4.0.

In terms of what can be done to avoid this error there are two strategies
- Felix had this other thread about JDK 11 that should at least let
Spark run on the CRAN instance. In general this strategy isn't
foolproof because the JDK version and other dependencies on that
machine keep changing over time, and we don't have much control over
them.
- The other solution is to not run code to build the vignettes
document and just have static code blocks there that have been
pre-evaluated / pre-populated. We can open a JIRA to discuss the
pros/cons of this?
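
To make the second option concrete: a pre-evaluated vignette would
replace live chunks with `eval = FALSE` chunks plus pasted-in output,
roughly like the sketch below. This is only illustrative, not the
actual contents of sparkr-vignettes.Rmd; the example uses the
`faithful` dataset the way the vignette's intro does.

````markdown
```{r, eval = FALSE}
# Chunk is shown but never run during R CMD check, so no JVM is needed
df <- as.DataFrame(faithful)
head(df)
```

```
  eruptions waiting
1     3.600      79
2     1.800      54
```
````

The trade-off worth capturing in that JIRA is that pre-populated
output can silently drift from what the current code actually
produces, so we'd need some other way to keep the snippets honest.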

Thanks
Shivaram

On Tue, Nov 6, 2018 at 10:57 AM Felix Cheung <felixcheun...@hotmail.com> wrote:
>
> We have not been able to publish to CRAN for quite some time (since 2.3.0 was 
> archived - the cause is Java 11)
>
> I think it’s ok to announce the release of 2.4.0
>
>
> ________________________________
> From: Wenchen Fan <cloud0...@gmail.com>
> Sent: Tuesday, November 6, 2018 8:51 AM
> To: Felix Cheung
> Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
> Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>
> Do you mean we should have a 2.4.0 release without CRAN and then do a 2.4.1 
> immediately?
>
> On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung <felixcheun...@hotmail.com> 
> wrote:
>>
>> Shivaram and I were discussing.
>> Actually we worked with them before. Another possible approach is to remove
>> the vignettes eval and all tests from the source package... in the next
>> release.
>>
>>
>> ________________________________
>> From: Matei Zaharia <matei.zaha...@gmail.com>
>> Sent: Tuesday, November 6, 2018 12:07 AM
>> To: Felix Cheung
>> Cc: Sean Owen; dev; Shivaram Venkataraman
>> Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>>
>> Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps we
>> aren’t disabling it correctly, or perhaps they can ignore this specific 
>> failure. +Shivaram who might have some ideas.
>>
>> Matei
>>
>> > On Nov 5, 2018, at 9:09 PM, Felix Cheung <felixcheun...@hotmail.com> wrote:
>> >
>> > I don't know what the cause is yet.
>> >
>> > The test should be skipped because of this check
>> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L21
>> >
>> > And this
>> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L57
>> >
>> > But it ran:
>> > callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", 
>> > "fit", formula,
>> >
>> > The earlier release was archived because of Java 11+ too, so this
>> > unfortunately isn't new.
>> >
>> >
>> > From: Sean Owen <sro...@gmail.com>
>> > Sent: Monday, November 5, 2018 7:22 PM
>> > To: Felix Cheung
>> > Cc: dev
>> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> >
>> > What can we do to get the release through? is there any way to
>> > circumvent these tests or otherwise hack it? or does it need a
>> > maintenance release?
>> > On Mon, Nov 5, 2018 at 8:53 PM Felix Cheung <felixcheun...@hotmail.com> 
>> > wrote:
>> > >
>> > > FYI. SparkR submission failed. It seems to detect Java 11 correctly in the 
>> > > vignettes, but the tests are not being skipped as expected.
>> > >
>> > > Error: processing vignette 'sparkr-vignettes.Rmd' failed with 
>> > > diagnostics:
>> > > Java version 8 is required for this package; found version: 11.0.1
>> > > Execution halted
>> > >
>> > > * checking PDF version of manual ... OK
>> > > * DONE
>> > > Status: 1 WARNING, 1 NOTE
>> > >
>> > > Current CRAN status: ERROR: 1, OK: 1
>> > > See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>
>> > >
>> > > Version: 2.3.0
>> > > Check: tests, Result: ERROR
>> > > Running 'run-all.R' [8s/35s]
>> > > Running the tests in 'tests/run-all.R' failed.
>> > > Last 13 lines of output:
>> > > 4: 
>> > > callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", 
>> > > "fit", formula,
>> > > data@sdf, tolower(family$family), family$link, tol, as.integer(maxIter), 
>> > > weightCol,
>> > > regParam, as.double(var.power), as.double(link.power), 
>> > > stringIndexerOrderType,
>> > > offsetCol)
>> > > 5: invokeJava(isStatic = TRUE, className, methodName, ...)
>> > > 6: handleErrors(returnStatus, conn)
>> > > 7: stop(readString(conn))
>> > >
>> > > ── testthat results ──────────────────────────────────────────────────────
>> > > OK: 0 SKIPPED: 0 FAILED: 2
>> > > 1. Error: create DataFrame from list or data.frame (@test_basic.R#26)
>> > > 2. Error: spark.glm and predict (@test_basic.R#58)
>> > >
>> > >
>> > >
>> > > ---------- Forwarded message ---------
>> > > Date: Mon, Nov 5, 2018, 10:12
>> > > Subject: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> > >
>> > > Dear maintainer,
>> > >
>> > > package SparkR_2.4.0.tar.gz does not pass the incoming checks 
>> > > automatically, please see the following pre-tests:
>> > > Windows: 
>> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Windows/00check.log>
>> > > Status: 1 NOTE
>> > > Debian: 
>> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Debian/00check.log>
>> > > Status: 1 WARNING, 1 NOTE
>> > >
>> > > Last released version's CRAN status: ERROR: 1, OK: 1
>> > > See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>
>> > >
>> > > CRAN Web: <https://cran.r-project.org/package=SparkR>
>> > >
>> > > Please fix all problems and resubmit a fixed version via the webform.
>> > > If you are not sure how to fix the problems shown, please ask for help 
>> > > on the R-package-devel mailing list:
>> > > <https://stat.ethz.ch/mailman/listinfo/r-package-devel>
>> > > If you are fairly certain the rejection is a false positive, please 
>> > > reply-all to this message and explain.
>> > >
>> > > More details are given in the directory:
>> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/>
>> > > The files will be removed after roughly 7 days.
>> > >
>> > > No strong reverse dependencies to be checked.
>> > >
>> > > Best regards,
>> > > CRAN teams' auto-check service
>> > > Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-ix86+x86_64
>> > > Check: CRAN incoming feasibility, Result: NOTE
>> > > Maintainer: 'Shivaram Venkataraman <shiva...@cs.berkeley.edu>'
>> > >
>> > > New submission
>> > >
>> > > Package was archived on CRAN
>> > >
>> > > Possibly mis-spelled words in DESCRIPTION:
>> > > Frontend (4:10, 5:28)
>> > >
>> > > CRAN repository db overrides:
>> > > X-CRAN-Comment: Archived on 2018-05-01 as check problems were not
>> > > corrected despite reminders.
>> > >
>> > > Flavor: r-devel-linux-x86_64-debian-gcc
>> > > Check: re-building of vignette outputs, Result: WARNING
>> > > Error in re-building vignettes:
>> > > ...
>> > >
>> > > Attaching package: 'SparkR'
>> > >
>> > > The following objects are masked from 'package:stats':
>> > >
>> > > cov, filter, lag, na.omit, predict, sd, var, window
>> > >
>> > > The following objects are masked from 'package:base':
>> > >
>> > > as.data.frame, colnames, colnames<-, drop, endsWith,
>> > > intersect, rank, rbind, sample, startsWith, subset, summary,
>> > > transform, union
>> > >
>> > > trying URL 
>> > > 'http://mirror.klaus-uwe.me/apache/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz'
>> > > Content type 'application/octet-stream' length 227893062 bytes (217.3 MB)
>> > > ==================================================
>> > > downloaded 217.3 MB
>> > >
>> > > Quitting from lines 65-67 (sparkr-vignettes.Rmd)
>> > > Error: processing vignette 'sparkr-vignettes.Rmd' failed with 
>> > > diagnostics:
>> > > Java version 8 is required for this package; found version: 11.0.1
>> > > Execution halted
>>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
