Re: JDK11 Support in Apache Spark

2019-08-27 Thread Hyukjin Kwon
YaY!

2019년 8월 27일 (화) 오후 3:36, Dongjoon Hyun 님이 작성:

> Hi, All.
>
> Thank you for your attention!
>
> UPDATE: We succeeded in building with JDK8 and testing with JDK11.
>
> - https://github.com/apache/spark/pull/25587
> -
> https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/4842
> (Scala/Java/Python/R)
>
> We are ready to release Maven artifacts as a single artifact for both JDK8
> and JDK11.
>
> According to this email thread, I believe this is the last piece to
> resolve the following issue.
>
> https://issues.apache.org/jira/browse/SPARK-24417 (Build and Run
> Spark on JDK11)
>
> To committers, please use `[test-hadoop3.2][test-java11]` to verify JDK11
> compatibility on the relevant PRs.
>
> Bests,
> Dongjoon.
>


Re: JDK11 Support in Apache Spark

2019-08-27 Thread Dongjoon Hyun
Hi, All.

Thank you for your attention!

UPDATE: We succeeded in building with JDK8 and testing with JDK11.

- https://github.com/apache/spark/pull/25587
-
https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/4842
(Scala/Java/Python/R)

We are ready to release Maven artifacts as a single artifact for both JDK8
and JDK11.

According to this email thread, I believe this is the last piece to resolve
the following issue.

https://issues.apache.org/jira/browse/SPARK-24417 (Build and Run Spark
on JDK11)

To committers, please use `[test-hadoop3.2][test-java11]` to verify JDK11
compatibility on the relevant PRs.

Bests,
Dongjoon.
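
A quick way for downstream users to sanity-check the "single artifact for both JDK8 and JDK11" claim is to inspect the class-file major version inside a published jar (52 means JDK8 bytecode, 55 means JDK11). The sketch below is an illustration only, not part of the Spark build; the helper name is made up:

```python
import struct
import zipfile

def class_major_version(jar_path):
    """Return the class-file major version of the first .class entry
    in a jar (52 = Java 8, 55 = Java 11). Illustrative sketch only."""
    with zipfile.ZipFile(jar_path) as jar:
        for name in jar.namelist():
            if name.endswith(".class"):
                # Class files start with magic 0xCAFEBABE, then
                # a 2-byte minor and 2-byte major version.
                magic, _minor, major = struct.unpack(">IHH", jar.read(name)[:8])
                assert magic == 0xCAFEBABE, "not a valid class file"
                return major
    return None
```

For a dual-JDK artifact, every entry should report 52, so the jar loads on a JDK8 runtime as well as JDK11.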


Re: JDK11 Support in Apache Spark

2019-08-26 Thread Sean Owen
Bringing a side conversation back to main: good news / bad news.

We most definitely want one build to run on JDK 8 and JDK 11. That is
actually what both of the JDK 11 jobs do right now, so I believe the
passing Jenkins job suggests that already works.

The downside is that I think we haven't fully debugged PySpark on
JDK 11, although what's left looks minor and could be some odd build
env issue. That is, those jobs do not run PySpark (and
actually, I think that's a problem with several existing jobs: they don't
run the run-tests script, just run tests via Maven).

It's definitely a milestone and a whole lot clearly works. The finishing
touch may be just build job cleanup.


On Mon, Aug 26, 2019 at 12:23 PM Reynold Xin  wrote:

> Would it be possible to have one build that works for both?
>
> On Mon, Aug 26, 2019 at 10:22 AM Dongjoon Hyun 
> wrote:
>
>> Thank you all!
>>
>> Let me add more explanation on the current status.
>>
>> - If you want to run on JDK8, you need to build on JDK8
>> - If you want to run on JDK11, you need to build on JDK11.
>>
>> The other combinations will not work.
>>
>> Currently, we have two Jenkins jobs. (1) is the one I pointed, and (2) is
>> the one for the remaining community work.
>>
>> 1) Build and test on JDK11 (spark-master-test-maven-hadoop-3.2-jdk-11)
>> 2) Build on JDK8 and test on JDK11
>> (spark-master-test-maven-hadoop-2.7-jdk-11-ubuntu-testing)
>>
>> To keep JDK11 compatibility, the following is merged today.
>>
>> [SPARK-28701][TEST-HADOOP3.2][TEST-JAVA11][K8S] adding java11
>> support for pull request builds
>>
>> But we still have a lot to do for Jenkins and the release, and we need your
>> support with JDK11. :)
>>
>> Bests,
>> Dongjoon.
>>
>>
>> On Sun, Aug 25, 2019 at 10:30 PM Takeshi Yamamuro 
>> wrote:
>>
>>> Cool, congrats!
>>>
>>> Bests,
>>> Takeshi
>>>
>>> On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi 
>>> wrote:
>>>
>>>> That's Awesome !!!
>>>>
>>>> Thanks to everyone that made this possible :cheers:
>>>>
>>>> Hichame
>>>>
>>>> *From:* cloud0...@gmail.com
>>>> *Sent:* August 25, 2019 10:43 PM
>>>> *To:* lix...@databricks.com
>>>> *Cc:* felixcheun...@hotmail.com; ravishankar.n...@gmail.com;
>>>> dongjoon.h...@gmail.com; dev@spark.apache.org; u...@spark.apache.org
>>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>>
>>>> Great work!
>>>>
>>>> On Sun, Aug 25, 2019 at 6:03 AM Xiao Li  wrote:
>>>>
>>>>> Thank you for your contributions! This is a great feature for Spark
>>>>> 3.0! We finally achieve it!
>>>>>
>>>>> Xiao
>>>>>
>>>>> On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung <
>>>>> felixcheun...@hotmail.com> wrote:
>>>>>
>>>>>> That’s great!
>>>>>>
>>>>>> --
>>>>>> *From:* ☼ R Nair 
>>>>>> *Sent:* Saturday, August 24, 2019 10:57:31 AM
>>>>>> *To:* Dongjoon Hyun 
>>>>>> *Cc:* dev@spark.apache.org ; user @spark/'user
>>>>>> @spark'/spark users/user@spark 
>>>>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>>>>
>>>>>> Finally!!! Congrats
>>>>>>
>>>>>> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
>>>>>> wrote:
>>>>>>
>>>>>>> Hi, All.
>>>>>>>
>>>>>>> Thanks to your many many contributions,
>>>>>>> Apache Spark master branch starts to pass on JDK11 as of today.
>>>>>>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>>>>>>
>>>>>>>
>>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>>>>>>> (JDK11 is used for building and testing.)
>>>>>>>
>>>>>>> We already verified all UTs (including PySpark/SparkR) before.
>>>>>>>
>>>>>>> Please feel free to use JDK11 in order to build/test/run `master`
>>>>>>> branch and
>>>>>>> share your experience including any issues. It will help Apache
>>>>>>> Spark 3.0.0 release.
>>>>>>>
>>>>>>> For the follow-ups, please follow
>>>>>>> https://issues.apache.org/jira/browse/SPARK-24417 .
>>>>>>> The next step is `how to support JDK8/JDK11 together in a single
>>>>>>> artifact`.
>>>>>>>
>>>>>>> Bests,
>>>>>>> Dongjoon.
>>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>> [image: Databricks Summit - Watch the talks]
>>>>> <https://databricks.com/sparkaisummit/north-america>
>>>>>
>>>>
>>>
>>> --
>>> ---
>>> Takeshi Yamamuro
>>>
>>


Re: JDK11 Support in Apache Spark

2019-08-26 Thread Matei Zaharia
+1, it’s super messy without that. But great to see this running!

> On Aug 26, 2019, at 10:53 AM, Reynold Xin  wrote:
> 
> Exactly - I think it's important to be able to create a single binary build.
> Otherwise downstream users (the 99.99% who won't be building their own Spark
> but will just pull it from Maven) will have to deal with the mess, and it's
> even worse for libraries.
> 
> On Mon, Aug 26, 2019 at 10:51 AM, Dongjoon Hyun  wrote:
> Oh, right. If you want to publish something to Maven, it will inherit the
> situation.
> Thank you for the feedback. :)
> 
> On Mon, Aug 26, 2019 at 10:37 AM Michael Heuer  wrote:
> That is not true for any downstream users who also provide a library.
> Whatever build mess you create in Apache Spark, we'll have to inherit it.  ;)
> 
>    michael
> 
>> On Aug 26, 2019, at 12:32 PM, Dongjoon Hyun  wrote:
>> 
>> As Shane wrote, not yet.
>> 
>> `one build that works for both` is our aspiration and the next step mentioned
>> in the first email.
>> 
>> > The next step is `how to support JDK8/JDK11 together in a single artifact`.
>> 
>> For the downstream users who build from the Apache Spark source, that will
>> not be a blocker because they will prefer a single JDK.
>> 
>> Bests,
>> Dongjoon.
>> 
>> On Mon, Aug 26, 2019 at 10:28 AM Shane Knapp  wrote:
>> maybe in the future, but not right now as the hadoop 2.7 build is broken.
>> 
>> also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
>> https://github.com/apache/spark/pull/25585
>> 
>> quick fix, testing now.
>> 
>> On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin  wrote:
>> Would it be possible to have one build that works for both?



Re: JDK11 Support in Apache Spark

2019-08-26 Thread Reynold Xin
Exactly - I think it's important to be able to create a single binary build. 
Otherwise downstream users (the 99.99% who won't be building their own Spark but 
will just pull it from Maven) will have to deal with the mess, and it's even worse 
for libraries.

On Mon, Aug 26, 2019 at 10:51 AM, Dongjoon Hyun  wrote:

> Oh, right. If you want to publish something to Maven, it will inherit the
> situation.
> Thank you for the feedback. :)
> 
> On Mon, Aug 26, 2019 at 10:37 AM Michael Heuer  wrote:
> 
>> That is not true for any downstream users who also provide a library.
>> Whatever build mess you create in Apache Spark, we'll have to inherit it.  ;)
>> 
>>    michael
>> 
>>> On Aug 26, 2019, at 12:32 PM, Dongjoon Hyun  wrote:
>>> 
>>> As Shane wrote, not yet.
>>> 
>>> `one build that works for both` is our aspiration and the next step
>>> mentioned in the first email.
>>> 
>>> > The next step is `how to support JDK8/JDK11 together in a single
>>> artifact`.
>>> 
>>> For the downstream users who build from the Apache Spark source, that will
>>> not be a blocker because they will prefer a single JDK.
>>> 
>>> Bests,
>>> Dongjoon.
>>> 
>>> On Mon, Aug 26, 2019 at 10:28 AM Shane Knapp  wrote:
>>> 
>>>> maybe in the future, but not right now as the hadoop 2.7 build is broken.
>>>> 
>>>> also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
>>>> https://github.com/apache/spark/pull/25585
>>>> 
>>>> quick fix, testing now.
>>>> 
>>>> On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin  wrote:
>>>> 
>>>>> Would it be possible to have one build that works for both?

Re: JDK11 Support in Apache Spark

2019-08-26 Thread Dongjoon Hyun
Oh, right. If you want to publish something to Maven, it will inherit the
situation.
Thank you for the feedback. :)

On Mon, Aug 26, 2019 at 10:37 AM Michael Heuer  wrote:

> That is not true for any downstream users who also provide a library.
> Whatever build mess you create in Apache Spark, we'll have to inherit it.
>  ;)
>
>michael
>
>
> On Aug 26, 2019, at 12:32 PM, Dongjoon Hyun 
> wrote:
>
> As Shane wrote, not yet.
>
> `one build that works for both` is our aspiration and the next step
> mentioned in the first email.
>
> > The next step is `how to support JDK8/JDK11 together in a single
> artifact`.
>
> For the downstream users who build from the Apache Spark source, that will
> not be a blocker because they will prefer a single JDK.
>
> Bests,
> Dongjoon.
>
> On Mon, Aug 26, 2019 at 10:28 AM Shane Knapp  wrote:
>
>> maybe in the future, but not right now as the hadoop 2.7 build is broken.
>>
>> also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
>> https://github.com/apache/spark/pull/25585
>>
>> quick fix, testing now.
>>
>> On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin  wrote:
>>
>>> Would it be possible to have one build that works for both?
>>>
>>>
>>
>


Re: JDK11 Support in Apache Spark

2019-08-26 Thread Michael Heuer
That is not true for any downstream users who also provide a library.  Whatever 
build mess you create in Apache Spark, we'll have to inherit it.  ;)

   michael


> On Aug 26, 2019, at 12:32 PM, Dongjoon Hyun  wrote:
> 
> As Shane wrote, not yet.
> 
> `one build that works for both` is our aspiration and the next step mentioned 
> in the first email.
> 
> > The next step is `how to support JDK8/JDK11 together in a single artifact`.
> 
> For the downstream users who build from the Apache Spark source, that will 
> not be a blocker because they will prefer a single JDK.
> 
> Bests,
> Dongjoon.
> 
> On Mon, Aug 26, 2019 at 10:28 AM Shane Knapp  wrote:
> maybe in the future, but not right now as the hadoop 2.7 build is broken.
> 
> also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
> https://github.com/apache/spark/pull/25585
> 
> quick fix, testing now.
> 
> On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin  wrote:
> Would it be possible to have one build that works for both?



Re: JDK11 Support in Apache Spark

2019-08-26 Thread Shane Knapp
maybe in the future, but not right now as the hadoop 2.7 build is broken.

also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
https://github.com/apache/spark/pull/25585

quick fix, testing now.

On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin  wrote:

> Would it be possible to have one build that works for both?
>
> On Mon, Aug 26, 2019 at 10:22 AM Dongjoon Hyun 
> wrote:
>
>> Thank you all!
>>
>> Let me add more explanation on the current status.
>>
>> - If you want to run on JDK8, you need to build on JDK8
>> - If you want to run on JDK11, you need to build on JDK11.
>>
>> The other combinations will not work.
>>
>> Currently, we have two Jenkins jobs. (1) is the one I pointed, and (2) is
>> the one for the remaining community work.
>>
>> 1) Build and test on JDK11 (spark-master-test-maven-hadoop-3.2-jdk-11)
>> 2) Build on JDK8 and test on JDK11
>> (spark-master-test-maven-hadoop-2.7-jdk-11-ubuntu-testing)
>>
>> To keep JDK11 compatibility, the following is merged today.
>>
>> [SPARK-28701][TEST-HADOOP3.2][TEST-JAVA11][K8S] adding java11
>> support for pull request builds
>>
>> But we still have a lot to do for Jenkins and the release, and we need your
>> support with JDK11. :)
>>
>> Bests,
>> Dongjoon.
>>
>>
>> On Sun, Aug 25, 2019 at 10:30 PM Takeshi Yamamuro 
>> wrote:
>>
>>> Cool, congrats!
>>>
>>> Bests,
>>> Takeshi
>>>
>>> On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi 
>>> wrote:
>>>
>>>> That's Awesome !!!
>>>>
>>>> Thanks to everyone that made this possible :cheers:
>>>>
>>>> Hichame
>>>>
>>>> *From:* cloud0...@gmail.com
>>>> *Sent:* August 25, 2019 10:43 PM
>>>> *To:* lix...@databricks.com
>>>> *Cc:* felixcheun...@hotmail.com; ravishankar.n...@gmail.com;
>>>> dongjoon.h...@gmail.com; dev@spark.apache.org; u...@spark.apache.org
>>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>>
>>>> Great work!
>>>>
>>>> On Sun, Aug 25, 2019 at 6:03 AM Xiao Li  wrote:
>>>>
>>>>> Thank you for your contributions! This is a great feature for Spark
>>>>> 3.0! We finally achieve it!
>>>>>
>>>>> Xiao
>>>>>
>>>>> On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung <
>>>>> felixcheun...@hotmail.com> wrote:
>>>>>
>>>>>> That’s great!
>>>>>>
>>>>>> --
>>>>>> *From:* ☼ R Nair 
>>>>>> *Sent:* Saturday, August 24, 2019 10:57:31 AM
>>>>>> *To:* Dongjoon Hyun 
>>>>>> *Cc:* dev@spark.apache.org ; user @spark/'user
>>>>>> @spark'/spark users/user@spark 
>>>>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>>>>
>>>>>> Finally!!! Congrats
>>>>>>
>>>>>> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
>>>>>> wrote:
>>>>>>
>>>>>>> Hi, All.
>>>>>>>
>>>>>>> Thanks to your many many contributions,
>>>>>>> Apache Spark master branch starts to pass on JDK11 as of today.
>>>>>>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>>>>>>
>>>>>>>
>>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>>>>>>> (JDK11 is used for building and testing.)
>>>>>>>
>>>>>>> We already verified all UTs (including PySpark/SparkR) before.
>>>>>>>
>>>>>>> Please feel free to use JDK11 in order to build/test/run `master`
>>>>>>> branch and
>>>>>>> share your experience including any issues. It will help Apache
>>>>>>> Spark 3.0.0 release.
>>>>>>>
>>>>>>> For the follow-ups, please follow
>>>>>>> https://issues.apache.org/jira/browse/SPARK-24417 .
>>>>>>> The next step is `how to support JDK8/JDK11 together in a single
>>>>>>> artifact`.
>>>>>>>
>>>>>>> Bests,
>>>>>>> Dongjoon.
>>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>> [image: Databricks Summit - Watch the talks]
>>>>> <https://databricks.com/sparkaisummit/north-america>
>>>>>
>>>>
>>>
>>> --
>>> ---
>>> Takeshi Yamamuro
>>>
>>

-- 
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu


Re: JDK11 Support in Apache Spark

2019-08-26 Thread Reynold Xin
Would it be possible to have one build that works for both?

On Mon, Aug 26, 2019 at 10:22 AM Dongjoon Hyun 
wrote:

> Thank you all!
>
> Let me add more explanation on the current status.
>
> - If you want to run on JDK8, you need to build on JDK8
> - If you want to run on JDK11, you need to build on JDK11.
>
> The other combinations will not work.
>
> Currently, we have two Jenkins jobs. (1) is the one I pointed, and (2) is
> the one for the remaining community work.
>
> 1) Build and test on JDK11 (spark-master-test-maven-hadoop-3.2-jdk-11)
> 2) Build on JDK8 and test on JDK11
> (spark-master-test-maven-hadoop-2.7-jdk-11-ubuntu-testing)
>
> To keep JDK11 compatibility, the following is merged today.
>
> [SPARK-28701][TEST-HADOOP3.2][TEST-JAVA11][K8S] adding java11
> support for pull request builds
>
> But we still have a lot to do for Jenkins and the release, and we need your
> support with JDK11. :)
>
> Bests,
> Dongjoon.
>
>
> On Sun, Aug 25, 2019 at 10:30 PM Takeshi Yamamuro 
> wrote:
>
>> Cool, congrats!
>>
>> Bests,
>> Takeshi
>>
>> On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi 
>> wrote:
>>
>>> That's Awesome !!!
>>>
>>> Thanks to everyone that made this possible :cheers:
>>>
>>> Hichame
>>>
>>> *From:* cloud0...@gmail.com
>>> *Sent:* August 25, 2019 10:43 PM
>>> *To:* lix...@databricks.com
>>> *Cc:* felixcheun...@hotmail.com; ravishankar.n...@gmail.com;
>>> dongjoon.h...@gmail.com; dev@spark.apache.org; u...@spark.apache.org
>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>
>>> Great work!
>>>
>>> On Sun, Aug 25, 2019 at 6:03 AM Xiao Li  wrote:
>>>
>>>> Thank you for your contributions! This is a great feature for Spark
>>>> 3.0! We finally achieve it!
>>>>
>>>> Xiao
>>>>
>>>> On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung <
>>>> felixcheun...@hotmail.com> wrote:
>>>>
>>>>> That’s great!
>>>>>
>>>>> --
>>>>> *From:* ☼ R Nair 
>>>>> *Sent:* Saturday, August 24, 2019 10:57:31 AM
>>>>> *To:* Dongjoon Hyun 
>>>>> *Cc:* dev@spark.apache.org ; user @spark/'user
>>>>> @spark'/spark users/user@spark 
>>>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>>>
>>>>> Finally!!! Congrats
>>>>>
>>>>> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
>>>>> wrote:
>>>>>
>>>>>> Hi, All.
>>>>>>
>>>>>> Thanks to your many many contributions,
>>>>>> Apache Spark master branch starts to pass on JDK11 as of today.
>>>>>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>>>>>
>>>>>>
>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>>>>>> (JDK11 is used for building and testing.)
>>>>>>
>>>>>> We already verified all UTs (including PySpark/SparkR) before.
>>>>>>
>>>>>> Please feel free to use JDK11 in order to build/test/run `master`
>>>>>> branch and
>>>>>> share your experience including any issues. It will help Apache Spark
>>>>>> 3.0.0 release.
>>>>>>
>>>>>> For the follow-ups, please follow
>>>>>> https://issues.apache.org/jira/browse/SPARK-24417 .
>>>>>> The next step is `how to support JDK8/JDK11 together in a single
>>>>>> artifact`.
>>>>>>
>>>>>> Bests,
>>>>>> Dongjoon.
>>>>>>
>>>>>
>>>>
>>>> --
>>>> [image: Databricks Summit - Watch the talks]
>>>> <https://databricks.com/sparkaisummit/north-america>
>>>>
>>>
>>
>> --
>> ---
>> Takeshi Yamamuro
>>
>


Re: JDK11 Support in Apache Spark

2019-08-26 Thread Dongjoon Hyun
Thank you all!

Let me add more explanation on the current status.

- If you want to run on JDK8, you need to build on JDK8.
- If you want to run on JDK11, you need to build on JDK11.

The other combinations will not work.

Currently, we have two Jenkins jobs. (1) is the one I pointed, and (2) is
the one for the remaining community work.

1) Build and test on JDK11 (spark-master-test-maven-hadoop-3.2-jdk-11)
2) Build on JDK8 and test on JDK11
(spark-master-test-maven-hadoop-2.7-jdk-11-ubuntu-testing)

To keep JDK11 compatibility, the following is merged today.

[SPARK-28701][TEST-HADOOP3.2][TEST-JAVA11][K8S] adding java11
support for pull request builds

But we still have a lot to do for Jenkins and the release, and we need your
support with JDK11. :)

Bests,
Dongjoon.
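
One reason the build JDK and run JDK must match for now is that JDK9+ changed the `java.version` scheme (JEP 223): JDK8 reports strings like `1.8.0_222`, while JDK11 reports `11.0.4`, so any runtime version check has to handle both formats. The helper below is a hedged sketch of such a check, not Spark's actual implementation:

```python
def java_major_version(version_string):
    """Parse a `java.version` string into a major version number.
    Handles both the legacy scheme ("1.8.0_222" -> 8) and the
    JEP 223 scheme used since JDK 9 ("11.0.4" -> 11).
    Illustrative sketch; not Spark's actual version check."""
    parts = version_string.split(".")
    if parts[0] == "1":
        # Legacy scheme: 1.<major>.<minor>_<update>
        return int(parts[1])
    # Modern scheme: <major>.<minor>.<security>, possibly "11-ea"
    return int(parts[0].split("-")[0])
```

For example, `java_major_version("1.8.0_222")` yields 8 and `java_major_version("11.0.4")` yields 11, which is the distinction JDK11-compatibility code paths need to make.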


On Sun, Aug 25, 2019 at 10:30 PM Takeshi Yamamuro 
wrote:

> Cool, congrats!
>
> Bests,
> Takeshi
>
> On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi 
> wrote:
>
>> That's Awesome !!!
>>
>> Thanks to everyone that made this possible :cheers:
>>
>> Hichame
>>
>> *From:* cloud0...@gmail.com
>> *Sent:* August 25, 2019 10:43 PM
>> *To:* lix...@databricks.com
>> *Cc:* felixcheun...@hotmail.com; ravishankar.n...@gmail.com;
>> dongjoon.h...@gmail.com; dev@spark.apache.org; u...@spark.apache.org
>> *Subject:* Re: JDK11 Support in Apache Spark
>>
>> Great work!
>>
>> On Sun, Aug 25, 2019 at 6:03 AM Xiao Li  wrote:
>>
>>> Thank you for your contributions! This is a great feature for Spark
>>> 3.0! We finally achieve it!
>>>
>>> Xiao
>>>
>>> On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung 
>>> wrote:
>>>
>>>> That’s great!
>>>>
>>>> --------------
>>>> *From:* ☼ R Nair 
>>>> *Sent:* Saturday, August 24, 2019 10:57:31 AM
>>>> *To:* Dongjoon Hyun 
>>>> *Cc:* dev@spark.apache.org ; user @spark/'user
>>>> @spark'/spark users/user@spark 
>>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>>
>>>> Finally!!! Congrats
>>>>
>>>> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
>>>> wrote:
>>>>
>>>>> Hi, All.
>>>>>
>>>>> Thanks to your many many contributions,
>>>>> Apache Spark master branch starts to pass on JDK11 as of today.
>>>>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>>>>
>>>>>
>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>>>>> (JDK11 is used for building and testing.)
>>>>>
>>>>> We already verified all UTs (including PySpark/SparkR) before.
>>>>>
>>>>> Please feel free to use JDK11 in order to build/test/run `master`
>>>>> branch and
>>>>> share your experience including any issues. It will help Apache Spark
>>>>> 3.0.0 release.
>>>>>
>>>>> For the follow-ups, please follow
>>>>> https://issues.apache.org/jira/browse/SPARK-24417 .
>>>>> The next step is `how to support JDK8/JDK11 together in a single
>>>>> artifact`.
>>>>>
>>>>> Bests,
>>>>> Dongjoon.
>>>>>
>>>>
>>>
>>> --
>>> [image: Databricks Summit - Watch the talks]
>>> <https://databricks.com/sparkaisummit/north-america>
>>>
>>
>
> --
> ---
> Takeshi Yamamuro
>


Re: JDK11 Support in Apache Spark

2019-08-25 Thread Takeshi Yamamuro
Cool, congrats!

Bests,
Takeshi

On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi 
wrote:

> That's Awesome !!!
>
> Thanks to everyone that made this possible :cheers:
>
> Hichame
>
> *From:* cloud0...@gmail.com
> *Sent:* August 25, 2019 10:43 PM
> *To:* lix...@databricks.com
> *Cc:* felixcheun...@hotmail.com; ravishankar.n...@gmail.com;
> dongjoon.h...@gmail.com; dev@spark.apache.org; u...@spark.apache.org
> *Subject:* Re: JDK11 Support in Apache Spark
>
> Great work!
>
> On Sun, Aug 25, 2019 at 6:03 AM Xiao Li  wrote:
>
>> Thank you for your contributions! This is a great feature for Spark
>> 3.0! We finally achieve it!
>>
>> Xiao
>>
>> On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung 
>> wrote:
>>
>>> That’s great!
>>>
>>> --
>>> *From:* ☼ R Nair 
>>> *Sent:* Saturday, August 24, 2019 10:57:31 AM
>>> *To:* Dongjoon Hyun 
>>> *Cc:* dev@spark.apache.org ; user @spark/'user
>>> @spark'/spark users/user@spark 
>>> *Subject:* Re: JDK11 Support in Apache Spark
>>>
>>> Finally!!! Congrats
>>>
>>> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
>>> wrote:
>>>
>>>> Hi, All.
>>>>
>>>> Thanks to your many many contributions,
>>>> Apache Spark master branch starts to pass on JDK11 as of today.
>>>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>>>
>>>>
>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>>>> (JDK11 is used for building and testing.)
>>>>
>>>> We already verified all UTs (including PySpark/SparkR) before.
>>>>
>>>> Please feel free to use JDK11 in order to build/test/run `master`
>>>> branch and
>>>> share your experience including any issues. It will help Apache Spark
>>>> 3.0.0 release.
>>>>
>>>> For the follow-ups, please follow
>>>> https://issues.apache.org/jira/browse/SPARK-24417 .
>>>> The next step is `how to support JDK8/JDK11 together in a single
>>>> artifact`.
>>>>
>>>> Bests,
>>>> Dongjoon.
>>>>
>>>
>>
>> --
>> [image: Databricks Summit - Watch the talks]
>> <https://databricks.com/sparkaisummit/north-america>
>>
>

-- 
---
Takeshi Yamamuro


Re: JDK11 Support in Apache Spark

2019-08-25 Thread Wenchen Fan
Great work!

On Sun, Aug 25, 2019 at 6:03 AM Xiao Li  wrote:

> Thank you for your contributions! This is a great feature for Spark
> 3.0! We finally achieve it!
>
> Xiao
>
> On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung 
> wrote:
>
>> That’s great!
>>
>> --
>> *From:* ☼ R Nair 
>> *Sent:* Saturday, August 24, 2019 10:57:31 AM
>> *To:* Dongjoon Hyun 
>> *Cc:* dev@spark.apache.org ; user @spark/'user
>> @spark'/spark users/user@spark 
>> *Subject:* Re: JDK11 Support in Apache Spark
>>
>> Finally!!! Congrats
>>
>> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
>> wrote:
>>
>>> Hi, All.
>>>
>>> Thanks to your many many contributions,
>>> Apache Spark master branch starts to pass on JDK11 as of today.
>>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>>
>>>
>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>>> (JDK11 is used for building and testing.)
>>>
>>> We already verified all UTs (including PySpark/SparkR) before.
>>>
>>> Please feel free to use JDK11 in order to build/test/run `master` branch
>>> and
>>> share your experience including any issues. It will help Apache Spark
>>> 3.0.0 release.
>>>
>>> For the follow-ups, please follow
>>> https://issues.apache.org/jira/browse/SPARK-24417 .
>>> The next step is `how to support JDK8/JDK11 together in a single
>>> artifact`.
>>>
>>> Bests,
>>> Dongjoon.
>>>
>>
>
> --
> [image: Databricks Summit - Watch the talks]
> <https://databricks.com/sparkaisummit/north-america>
>


Re: JDK11 Support in Apache Spark

2019-08-24 Thread Xiao Li
Thank you for your contributions! This is a great feature for Spark 3.0! We
finally achieve it!

Xiao

On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung 
wrote:

> That’s great!
>
> --
> *From:* ☼ R Nair 
> *Sent:* Saturday, August 24, 2019 10:57:31 AM
> *To:* Dongjoon Hyun 
> *Cc:* dev@spark.apache.org ; user @spark/'user
> @spark'/spark users/user@spark 
> *Subject:* Re: JDK11 Support in Apache Spark
>
> Finally!!! Congrats
>
> On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun 
> wrote:
>
>> Hi, All.
>>
>> Thanks to your many many contributions,
>> Apache Spark master branch starts to pass on JDK11 as of today.
>> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>>
>>
>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
>> (JDK11 is used for building and testing.)
>>
>> We already verified all UTs (including PySpark/SparkR) before.
>>
>> Please feel free to use JDK11 in order to build/test/run `master` branch
>> and
>> share your experience including any issues. It will help Apache Spark
>> 3.0.0 release.
>>
>> For the follow-ups, please follow
>> https://issues.apache.org/jira/browse/SPARK-24417 .
>> The next step is `how to support JDK8/JDK11 together in a single
>> artifact`.
>>
>> Bests,
>> Dongjoon.
>>
>

-- 
[image: Databricks Summit - Watch the talks]
<https://databricks.com/sparkaisummit/north-america>


Re: JDK11 Support in Apache Spark

2019-08-24 Thread Felix Cheung
That’s great!


From: ☼ R Nair 
Sent: Saturday, August 24, 2019 10:57:31 AM
To: Dongjoon Hyun 
Cc: dev@spark.apache.org ; user @spark/'user 
@spark'/spark users/user@spark 
Subject: Re: JDK11 Support in Apache Spark

Finally!!! Congrats

On Sat, Aug 24, 2019, 11:11 AM Dongjoon Hyun  wrote:
Hi, All.

Thanks to your many many contributions,
the Apache Spark master branch has started to pass on JDK11 as of today.
(with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)


https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
(JDK11 is used for building and testing.)

We already verified all UTs (including PySpark/SparkR) before.

Please feel free to use JDK11 in order to build/test/run `master` branch and
share your experience including any issues. It will help Apache Spark 3.0.0 
release.

For the follow-ups, please follow 
https://issues.apache.org/jira/browse/SPARK-24417 .
The next step is `how to support JDK8/JDK11 together in a single artifact`.

Bests,
Dongjoon.
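
For anyone who wants to build `master` and try JDK11, the build boils down to pointing `JAVA_HOME` at the desired JDK and enabling the `hadoop-3.2` profile mentioned above. The helper below only assembles the environment and command; it is a sketch, and everything beyond `-Phadoop-3.2` is ordinary Maven usage rather than something taken from this thread:

```python
import os

def spark_build_command(java_home, profiles=("hadoop-3.2",)):
    """Assemble the environment and Maven invocation for building Spark
    with a chosen JDK. Per the thread, the JDK used to build should
    currently match the JDK used to run. Sketch only."""
    # Build tools pick the compiler from JAVA_HOME.
    env = dict(os.environ, JAVA_HOME=java_home)
    cmd = (["./build/mvn"]
           + ["-P%s" % p for p in profiles]   # e.g. -Phadoop-3.2
           + ["-DskipTests", "clean", "package"])
    return env, cmd
```

Running the returned command with the returned environment (e.g. via `subprocess.run(cmd, env=env)`) from a Spark checkout would produce a build targeting that JDK.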


Re: JDK11 Support in Apache Spark

2019-08-24 Thread DB Tsai
Congratulations on the great work!

Sincerely,

DB Tsai
--
Web: https://www.dbtsai.com
PGP Key ID: 42E5B25A8F7A82C1

On Sat, Aug 24, 2019 at 8:11 AM Dongjoon Hyun  wrote:
>
> Hi, All.
>
> Thanks to your many many contributions,
> Apache Spark master branch starts to pass on JDK11 as of today.
> (with `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)
>
> 
> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-3.2-jdk-11/326/
> (JDK11 is used for building and testing.)
>
> We already verified all UTs (including PySpark/SparkR) before.
>
> Please feel free to use JDK11 in order to build/test/run `master` branch and
> share your experience including any issues. It will help Apache Spark 3.0.0 
> release.
>
> For the follow-ups, please follow 
> https://issues.apache.org/jira/browse/SPARK-24417 .
> The next step is `how to support JDK8/JDK11 together in a single artifact`.
>
> Bests,
> Dongjoon.

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org