That’s great!

________________________________
From: ☼ R Nair <ravishankar.n...@gmail.com>
Sent: Saturday, August 24, 2019 10:57:31 AM
To: Dongjoon Hyun <dongjoon.h...@gmail.com>
Cc: dev@spark.apache.org <dev@spark.apache.org>; spark users <u...@spark.apache.org>
Subject: Re: JDK11 Support in Apache Spark

Finally!!! Congrats

On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
Hi, All.

Thanks to your many contributions, the Apache Spark master branch now builds and
passes its tests on JDK11 as of today
(with the `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6).

    
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
    (JDK11 is used for building and testing.)

We have already verified that all unit tests (including PySpark/SparkR) pass.

Please feel free to build, test, and run the `master` branch with JDK11 and
share your experience, including any issues you encounter. This will help the
Apache Spark 3.0.0 release.
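
If you want a quick sanity check, a minimal smoke test like the sketch below
(the object name and app name are just placeholders; any small job would do)
can confirm which JVM your locally built `master` is actually running on.
Launch it with spark-submit, or paste the body into a spark-shell started
under JDK11:

    // Minimal sketch (hypothetical names): verify a locally built Spark `master`
    // runs on JDK11 by reporting the driver JVM and running a trivial job.
    import org.apache.spark.sql.SparkSession

    object Jdk11Smoke {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("jdk11-smoke-test")
          .master("local[*]")
          .getOrCreate()

        // Print the JVM version the driver is running on (expect 11.x).
        println(s"Driver JVM: ${System.getProperty("java.version")}")

        // Run a trivial job so task execution exercises the same JVM.
        val total = spark.range(0, 1000).selectExpr("sum(id)").first().getLong(0)
        println(s"sum of 0..999 = $total")

        spark.stop()
      }
    }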

For follow-ups, please follow https://issues.apache.org/jira/browse/SPARK-24417.
The next step is `how to support JDK8/JDK11 together in a single artifact`.

Bests,
Dongjoon.
