https://github.com/apache/incubator-celeborn/pull/2387

On Tue, Mar 12, 2024 at 12:02 PM Curtis Howard <
curtis.james.how...@gmail.com> wrote:

> Sure thing, I will send a PR.
>
> Curtis
>
> On Tue, Mar 12, 2024, 11:32 AM Keyong Zhou <zho...@apache.org> wrote:
>
>> I think so. Do you mind sending a PR to Celeborn? :)
>>
>> Regards,
>> Keyong Zhou
>>
>> Curtis Howard <curtis.james.how...@gmail.com> wrote on Tue, Mar 12, 2024 at 22:58:
>>
>>> Thanks Keyong,
>>>
>>> I was able to build the 'client' JARs for Spark 3.2 / JDK21 and run
>>> simple tests; however, I did hit a runtime error similar to this:
>>>
>>> Caused by: java.lang.ExceptionInInitializerError: Exception
>>> java.lang.IllegalStateException: java.lang.NoSuchMethodException:
>>> java.nio.DirectByteBuffer.<init>(long, int) [in thread "Executor task
>>> launch worker for task 0.0 in stage 0.0 (TID 0)"]
>>>         at org.apache.celeborn.common.unsafe.Platform.<clinit>(Platform.java:135)
>>>         ... 16 more
>>>
>>> I worked around this successfully with the following patch, borrowed from
>>> the nearly identical upstream Spark patch for SPARK-42369
>>> <https://github.com/apache/spark/pull/39909> that was required for JDK21
>>> as a result of JDK-8303083
>>> <https://bugs.openjdk.org/browse/JDK-8303083>. I think the same change
>>> may be required in Celeborn's Platform.java for JDK21 as well:
>>>
>>> diff --git a/common/src/main/java/org/apache/celeborn/common/unsafe/Platform.java b/common/src/main/java/org/apache/celeborn/common/unsafe/Platform.java
>>> index ec541a77..218d517b 100644
>>> --- a/common/src/main/java/org/apache/celeborn/common/unsafe/Platform.java
>>> +++ b/common/src/main/java/org/apache/celeborn/common/unsafe/Platform.java
>>> @@ -90,7 +90,15 @@ public final class Platform {
>>>      }
>>>      try {
>>>        Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
>>> -      Constructor<?> constructor = cls.getDeclaredConstructor(Long.TYPE, Integer.TYPE);
>>> +      Constructor<?> constructor;
>>> +      try {
>>> +        constructor = cls.getDeclaredConstructor(Long.TYPE, Integer.TYPE);
>>> +      } catch (NoSuchMethodException e) {
>>> +        // DirectByteBuffer(long,int) was removed in
>>> +        // https://github.com/openjdk/jdk/commit/a56598f5a534cc9223367e7faa8433ea38661db9
>>> +        constructor = cls.getDeclaredConstructor(Long.TYPE, Long.TYPE);
>>> +      }
>>> +
>>>        Field cleanerField = cls.getDeclaredField("cleaner");
>>>        try {
>>>          constructor.setAccessible(true);
>>>
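>>> For reference, here is a minimal standalone sketch of the same fallback
>>> (the class name and printout are just illustrative, not Celeborn code);
>>> it resolves whichever DirectByteBuffer constructor the running JDK
>>> provides:
>>>
>>> import java.lang.reflect.Constructor;
>>>
>>> public class DirectBufferConstructorProbe {
>>>   public static void main(String[] args) throws Exception {
>>>     Class<?> cls = Class.forName("java.nio.DirectByteBuffer");
>>>     Constructor<?> constructor;
>>>     try {
>>>       // Pre-JDK21 signature: DirectByteBuffer(long address, int capacity)
>>>       constructor = cls.getDeclaredConstructor(Long.TYPE, Integer.TYPE);
>>>     } catch (NoSuchMethodException e) {
>>>       // JDK21+ signature: DirectByteBuffer(long address, long capacity),
>>>       // see JDK-8303083
>>>       constructor = cls.getDeclaredConstructor(Long.TYPE, Long.TYPE);
>>>     }
>>>     System.out.println("Resolved constructor: " + constructor);
>>>   }
>>> }
>>>
>>> Note that actually invoking the resolved constructor (as Platform.java
>>> does) may also require --add-opens=java.base/java.nio=ALL-UNNAMED on
>>> JDK 17 and later.
>>>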
>>> Curtis
>>>
>>>
>>>
>>> On Tue, Mar 12, 2024 at 10:32 AM Keyong Zhou <zho...@apache.org> wrote:
>>>
>>>> Hi Curtis,
>>>>
>>>> With this PR https://github.com/apache/incubator-celeborn/pull/2385
>>>> you can compile with JDK21 using the following command:
>>>>  ./build/make-distribution.sh -Pspark-3.5 -Pjdk-21
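>>>>
>>>> For your Spark 3.2 build, the same script should work with the spark-3.2
>>>> profile, though I haven't verified that combination myself:
>>>>  ./build/make-distribution.sh -Pspark-3.2 -Pjdk-21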
>>>>
>>>> Regards,
>>>> Keyong Zhou
>>>>
>>>> Curtis Howard <curtis.james.how...@gmail.com> wrote on Sat, Mar 9, 2024 at 02:38:
>>>>
>>>>> Thank you Keyong!
>>>>>
>>>>> Related to this, has testing started for Celeborn with JDK21? Are there
>>>>> any anticipated concerns, based on what you know so far?
>>>>> We will be migrating to JDK21 shortly, which is why I ask.
>>>>>
>>>>> Thanks again
>>>>> Curtis
>>>>>
>>>>> On Fri, Mar 8, 2024 at 11:05 AM Keyong Zhou <zho...@apache.org> wrote:
>>>>>
>>>>>> Hi Curtis,
>>>>>>
>>>>>> Thanks for reaching out!
>>>>>>
>>>>>> No, there are no known blockers for Celeborn + Spark 3.2 + JDK17, and I
>>>>>> think there is a good chance that it can be used successfully.
>>>>>>
>>>>>> If you run into any problems in your tests, feel free to let us know :)
>>>>>>
>>>>>> Regards,
>>>>>> Keyong Zhou
>>>>>>
>>>>>> Curtis Howard <curtis.james.how...@gmail.com> wrote on Fri, Mar 8, 2024 at 22:23:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> We would like to confirm why Celeborn is not listed as compatible
>>>>>>> with Spark 3.2 and JDK17 in the compatibility matrix here:
>>>>>>> https://github.com/apache/incubator-celeborn?tab=readme-ov-file#build
>>>>>>>
>>>>>>> Is the reason only that the Apache Spark 3.2 release does not
>>>>>>> officially support JDK17 (as covered in
>>>>>>> https://issues.apache.org/jira/browse/SPARK-33772), or have any
>>>>>>> Celeborn-specific conflicts been found with the Spark 3.2 + JDK17
>>>>>>> combination?
>>>>>>>
>>>>>>> We currently build Spark ourselves with custom dependencies, and are
>>>>>>> successfully using Spark 3.2 with JDK17.  Understanding that this
>>>>>>> combination has likely not been tested with Celeborn, we are wondering
>>>>>>> if there are any known blockers for Celeborn + Spark 3.2 + JDK17, or
>>>>>>> if there is a good chance that it could still be used successfully.
>>>>>>>
>>>>>>> Thank you!
>>>>>>> Curtis
>>>>>>>
>>>>>>
