Exactly - I think it's important to be able to create a single binary build.
Otherwise downstream users (the 99.99% who won't build their own Spark but
will just pull it from Maven) will have to deal with the mess, and it's even
worse for libraries.
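
For illustration only - a rough sketch of my own, not Spark's actual code or
build setup: if the jar is compiled against the Java 8 API and bytecode level,
the same artifact loads on both JDK 8 and JDK 11, and anything JDK-specific can
then be guarded at run time. The object below is hypothetical:

    // Hypothetical helper, not from the Spark codebase.
    object JvmVersionCheck {
      /** Major Java version of the running JVM, e.g. 8 or 11. */
      def majorVersion: Int = {
        val spec = System.getProperty("java.specification.version")
        // JDK 8 reports "1.8"; JDK 9+ report "9", "10", "11", ...
        if (spec.startsWith("1.")) spec.stripPrefix("1.").toInt else spec.toInt
      }

      def main(args: Array[String]): Unit =
        println(s"Running on Java $majorVersion")
    }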

On Mon, Aug 26, 2019 at 10:51 AM, Dongjoon Hyun <dongjoon.hyun@gmail.com> wrote:

> 
> Oh, right. If you want to publish something to Maven, it will inherit the
> situation.
> Thank you for the feedback. :)
> 
> On Mon, Aug 26, 2019 at 10:37 AM Michael Heuer <heuermh@gmail.com> wrote:
> 
> 
>> That is not true for any downstream users who also provide a library. 
>> Whatever build mess you create in Apache Spark, we'll have to inherit it. 
>> ;)
>> 
>> 
>>    michael
>> 
>> 
>> 
>> 
>>> On Aug 26, 2019, at 12:32 PM, Dongjoon Hyun <dongjoon.hyun@gmail.com> wrote:
>>> 
>>> As Shane wrote, not yet.
>>> 
>>> 
>>> `One build that works for both` is our aspiration and the next step
>>> mentioned in the first email.
>>> 
>>> 
>>> 
>>> > The next step is `how to support JDK8/JDK11 together in a single artifact`.
>>> 
>>> 
>>> For downstream users who build from the Apache Spark source, that will
>>> not be a blocker, because they can simply pick a single JDK.
>>> 
>>> 
>>> 
>>> Bests,
>>> Dongjoon.
>>> 
>>> On Mon, Aug 26, 2019 at 10:28 AM Shane Knapp <sknapp@berkeley.edu> wrote:
>>> 
>>> 
>>>> maybe in the future, but not right now as the hadoop 2.7 build is broken.
>>>> 
>>>> 
>>>> also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
>>>> https://github.com/apache/spark/pull/25585
>>>> 
>>>> 
>>>> 
>>>> quick fix, testing now.
>>>> 
>>>> On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin <rxin@databricks.com> wrote:
>>>> 
>>>> 
>>>>> Would it be possible to have one build that works for both?
>>>>> 
>>>> 
>>>> 
>>> 
>>> 
>> 
>> 
> 
>
