Scala 2.11 is EOL, and only Scala 2.12 will support JDK 11 
(https://github.com/scala/scala-dev/issues/559#issuecomment-436160166), so we 
might need to make Scala 2.12 the default version in Spark 3.0 to move forward. 

Given Oracle's new 6-month release model, I think the only realistic option is 
to support and test only LTS JDKs. I'll send out two separate emails to dev to 
facilitate the discussion. 
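
One wrinkle in moving past JDK 8 is that the version-string scheme itself changed in Java 9 ("1.8.0_181" vs "11.0.1"), which is exactly the kind of thing version checks like SparkR's can trip over. A minimal shell sketch of handling both schemes (a hypothetical helper for illustration, not Spark code):

```shell
# Extract the JDK major version from a version string.
# Legacy scheme (pre-9):  1.8.0_181  -> major is the second dotted field (8)
# New scheme (9+):        11.0.1     -> major is the first dotted field (11)
parse_java_major() {
  ver="$1"
  case "$ver" in
    1.*) echo "$ver" | cut -d. -f2 ;;   # legacy "1.x" scheme
    *)   echo "$ver" | cut -d. -f1 ;;   # JEP 223 scheme
  esac
}

parse_java_major "1.8.0_181"   # -> 8
parse_java_major "11.0.1"      # -> 11
```

A check that only understands the legacy "1.x" scheme would misread "11.0.1", which is one plausible way a Java-8-only guard ends up rejecting (or failing to skip on) Java 11.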

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc

> On Nov 6, 2018, at 9:47 AM, shane knapp <skn...@berkeley.edu> wrote:
> 
> cool, i was wondering when we were going to forge ahead into the great 
> future of jdk8++...  i went ahead and created a sub-task of installing a 
> newer version of java on the build nodes 
> (https://issues.apache.org/jira/browse/SPARK-25953), and once we figure out 
> exactly which version we want i'll go ahead and get that done.
> 
> On Tue, Nov 6, 2018 at 9:11 AM Sean Owen <sro...@gmail.com> wrote:
> I think that Java 9 support basically gets Java 10, 11 support. But
> the jump from 8 to 9 is unfortunately more breaking than usual because
> of the total revamping of the internal JDK classes. I think it will be
> mostly a matter of dependencies needing updates to work. I agree this
> is probably pretty important for Spark 3. Here's the ticket I know of:
> https://issues.apache.org/jira/browse/SPARK-24417 . DB is already
> working on some of it, I see.
> On Tue, Nov 6, 2018 at 10:59 AM Felix Cheung <felixcheun...@hotmail.com> wrote:
> >
> > Speaking of, can we work to support Java 11?
> > That will fix all the problems below.
> >
> >
> >
> > ________________________________
> > From: Felix Cheung <felixcheun...@hotmail.com>
> > Sent: Tuesday, November 6, 2018 8:57 AM
> > To: Wenchen Fan
> > Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> >
> > We have not been able to publish to CRAN for quite some time (since 2.3.0 
> > was archived - the cause is Java 11)
> >
> > I think it’s ok to announce the release of 2.4.0
> >
> >
> > ________________________________
> > From: Wenchen Fan <cloud0...@gmail.com>
> > Sent: Tuesday, November 6, 2018 8:51 AM
> > To: Felix Cheung
> > Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> >
> > Do you mean we should have a 2.4.0 release without CRAN and then do a 2.4.1 
> > immediately?
> >
> > On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung <felixcheun...@hotmail.com> wrote:
> >>
> >> Shivaram and I were discussing.
> >> Actually we worked with them before. Another possible approach is to 
> >> remove the vignettes eval and all tests from the source package... in the 
> >> next release.
> >>
> >>
> >> ________________________________
> >> From: Matei Zaharia <matei.zaha...@gmail.com>
> >> Sent: Tuesday, November 6, 2018 12:07 AM
> >> To: Felix Cheung
> >> Cc: Sean Owen; dev; Shivaram Venkataraman
> >> Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> >>
> >> Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps 
> >> we aren’t disabling it correctly, or perhaps they can ignore this specific 
> >> failure. +Shivaram who might have some ideas.
> >>
> >> Matei
> >>
> >> > On Nov 5, 2018, at 9:09 PM, Felix Cheung <felixcheun...@hotmail.com> wrote:
> >> >
> >> > I don’t know what the cause is yet.
> >> >
> >> > The test should be skipped because of this check
> >> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L21
> >> >
> >> > And this
> >> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L57
> >> >
> >> > But it ran:
> >> > callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", 
> >> > "fit", formula,
> >> >
> >> > The earlier release was archived because of Java 11+ too, so this 
> >> > unfortunately isn’t new.
> >> >
> >> >
> >> > From: Sean Owen <sro...@gmail.com>
> >> > Sent: Monday, November 5, 2018 7:22 PM
> >> > To: Felix Cheung
> >> > Cc: dev
> >> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> >> >
> >> > What can we do to get the release through? is there any way to
> >> > circumvent these tests or otherwise hack it? or does it need a
> >> > maintenance release?
> >> > On Mon, Nov 5, 2018 at 8:53 PM Felix Cheung <felixcheun...@hotmail.com> wrote:
> >> > >
> >> > > FYI. SparkR submission failed. It seems to detect Java 11 correctly 
> >> > > in the vignettes but does not skip the tests as would be expected.
> >> > >
> >> > > Error: processing vignette 'sparkr-vignettes.Rmd' failed with 
> >> > > diagnostics:
> >> > > Java version 8 is required for this package; found version: 11.0.1
> >> > > Execution halted
> >> > >
> >> > > * checking PDF version of manual ... OK
> >> > > * DONE
> >> > > Status: 1 WARNING, 1 NOTE
> >> > >
> >> > > Current CRAN status: ERROR: 1, OK: 1
> >> > > See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>
> >> > >
> >> > > Version: 2.3.0
> >> > > Check: tests, Result: ERROR
> >> > > Running ‘run-all.R’ [8s/35s]
> >> > > Running the tests in ‘tests/run-all.R’ failed.
> >> > > Last 13 lines of output:
> >> > > 4: 
> >> > > callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper",
> >> > >  "fit", formula,
> >> > > data@sdf, tolower(family$family), family$link, tol, 
> >> > > as.integer(maxIter), weightCol,
> >> > > regParam, as.double(var.power), as.double(link.power), 
> >> > > stringIndexerOrderType,
> >> > > offsetCol)
> >> > > 5: invokeJava(isStatic = TRUE, className, methodName, ...)
> >> > > 6: handleErrors(returnStatus, conn)
> >> > > 7: stop(readString(conn))
> >> > >
> >> > > ══ testthat results ═══════════════════════════════════════════════════
> >> > > OK: 0 SKIPPED: 0 FAILED: 2
> >> > > 1. Error: create DataFrame from list or data.frame (@test_basic.R#26)
> >> > > 2. Error: spark.glm and predict (@test_basic.R#58)
> >> > >
> >> > >
> >> > >
> >> > > ---------- Forwarded message ---------
> >> > > Date: Mon, Nov 5, 2018, 10:12
> >> > > Subject: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> >> > >
> >> > > Dear maintainer,
> >> > >
> >> > > package SparkR_2.4.0.tar.gz does not pass the incoming checks 
> >> > > automatically, please see the following pre-tests:
> >> > > Windows: 
> >> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Windows/00check.log>
> >> > > Status: 1 NOTE
> >> > > Debian: 
> >> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Debian/00check.log>
> >> > > Status: 1 WARNING, 1 NOTE
> >> > >
> >> > > Last released version's CRAN status: ERROR: 1, OK: 1
> >> > > See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>
> >> > >
> >> > > CRAN Web: <https://cran.r-project.org/package=SparkR>
> >> > >
> >> > > Please fix all problems and resubmit a fixed version via the webform.
> >> > > If you are not sure how to fix the problems shown, please ask for help 
> >> > > on the R-package-devel mailing list:
> >> > > <https://stat.ethz.ch/mailman/listinfo/r-package-devel>
> >> > > If you are fairly certain the rejection is a false positive, please 
> >> > > reply-all to this message and explain.
> >> > >
> >> > > More details are given in the directory:
> >> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/>
> >> > > The files will be removed after roughly 7 days.
> >> > >
> >> > > No strong reverse dependencies to be checked.
> >> > >
> >> > > Best regards,
> >> > > CRAN teams' auto-check service
> >> > > Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-ix86+x86_64
> >> > > Check: CRAN incoming feasibility, Result: NOTE
> >> > > Maintainer: 'Shivaram Venkataraman <shiva...@cs.berkeley.edu>'
> >> > >
> >> > > New submission
> >> > >
> >> > > Package was archived on CRAN
> >> > >
> >> > > Possibly mis-spelled words in DESCRIPTION:
> >> > > Frontend (4:10, 5:28)
> >> > >
> >> > > CRAN repository db overrides:
> >> > > X-CRAN-Comment: Archived on 2018-05-01 as check problems were not
> >> > > corrected despite reminders.
> >> > >
> >> > > Flavor: r-devel-linux-x86_64-debian-gcc
> >> > > Check: re-building of vignette outputs, Result: WARNING
> >> > > Error in re-building vignettes:
> >> > > ...
> >> > >
> >> > > Attaching package: 'SparkR'
> >> > >
> >> > > The following objects are masked from 'package:stats':
> >> > >
> >> > > cov, filter, lag, na.omit, predict, sd, var, window
> >> > >
> >> > > The following objects are masked from 'package:base':
> >> > >
> >> > > as.data.frame, colnames, colnames<-, drop, endsWith,
> >> > > intersect, rank, rbind, sample, startsWith, subset, summary,
> >> > > transform, union
> >> > >
> >> > > trying URL 
> >> > > 'http://mirror.klaus-uwe.me/apache/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz'
> >> > > Content type 'application/octet-stream' length 227893062 bytes (217.3 
> >> > > MB)
> >> > > ==================================================
> >> > > downloaded 217.3 MB
> >> > >
> >> > > Quitting from lines 65-67 (sparkr-vignettes.Rmd)
> >> > > Error: processing vignette 'sparkr-vignettes.Rmd' failed with 
> >> > > diagnostics:
> >> > > Java version 8 is required for this package; found version: 11.0.1
> >> > > Execution halted
> >>
> 
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> 
> 
> 
> -- 
> Shane Knapp
> UC Berkeley EECS Research / RISELab Staff Technical Lead
> https://rise.cs.berkeley.edu
