Oh, right. If you want to publish something to Maven, it will inherit this
situation.
Thank you for the feedback. :)
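
For reference, here is a minimal, illustrative Java sketch of how a
downstream project could check what a published artifact actually targets
(the resource path below is a placeholder; class-file major version 52
means the class was compiled for JDK8, 55 for JDK11):

    import java.io.DataInputStream;
    import java.io.InputStream;

    // Prints the class-file version of a class found on the classpath.
    public class ClassVersionCheck {
        public static void main(String[] args) throws Exception {
            String resource = "org/apache/spark/SparkContext.class"; // placeholder
            try (InputStream in = ClassVersionCheck.class.getClassLoader()
                    .getResourceAsStream(resource)) {
                if (in == null) {
                    System.err.println("not on classpath: " + resource);
                    return;
                }
                DataInputStream data = new DataInputStream(in);
                int magic = data.readInt();           // 0xCAFEBABE
                int minor = data.readUnsignedShort();
                int major = data.readUnsignedShort(); // 52 = JDK8, 55 = JDK11
                System.out.printf("magic=%x major=%d minor=%d%n", magic, major, minor);
            }
        }
    }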

On Mon, Aug 26, 2019 at 10:37 AM Michael Heuer <heue...@gmail.com> wrote:

> That is not true for any downstream users who also provide a library.
> Whatever build mess you create in Apache Spark, we'll have to inherit it. ;)
>
>    michael
>
>
> On Aug 26, 2019, at 12:32 PM, Dongjoon Hyun <dongjoon.h...@gmail.com>
> wrote:
>
> As Shane wrote, not yet.
>
> `One build that works for both` is our aspiration and the next step
> mentioned in the first email.
>
> > The next step is `how to support JDK8/JDK11 together in a single
> artifact`.
>
> For the downstream users who build from the Apache Spark source, that will
> not be a blocker because they can build with a single JDK of their choice.
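>
> As a rough sketch of how a single artifact could serve both, one option is
> to compile on JDK11 with `--release 8`, so the emitted bytecode and the
> platform APIs used stay JDK8-compatible and the same jar runs on either
> runtime. An illustrative Java example (not Spark's actual build setup;
> `Example.java` is a placeholder), invoking the compiler programmatically:
>
>     import javax.tools.JavaCompiler;
>     import javax.tools.ToolProvider;
>
>     // Cross-compiles Example.java on JDK11 so the resulting class file
>     // (major version 52) also runs on a JDK8 runtime.
>     public class CrossCompile {
>         public static void main(String[] args) {
>             JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
>             int result = compiler.run(null, null, null,
>                     "--release", "8",  // pin bytecode and platform API to Java 8
>                     "Example.java");   // placeholder source file
>             System.out.println(result == 0 ? "compiled for JDK8" : "failed");
>         }
>     }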
>
> Bests,
> Dongjoon.
>
> On Mon, Aug 26, 2019 at 10:28 AM Shane Knapp <skn...@berkeley.edu> wrote:
>
>> maybe in the future, but not right now, as the hadoop 2.7 build is broken.
>>
>> also, i busted dev/run-tests.py in my changes to support java11 in PRBs:
>> https://github.com/apache/spark/pull/25585
>>
>> quick fix, testing now.
>>
>> On Mon, Aug 26, 2019 at 10:23 AM Reynold Xin <r...@databricks.com> wrote:
>>
>>> Would it be possible to have one build that works for both?
>>>
>>>
>>
>
