Do we need to publish the Scala 2.12 + Hadoop 3.2 jar packages to the Maven
repository? If we don't, Spark resolved from Maven will throw a
NoSuchMethodError on Java 11.
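
For context, this is roughly what a downstream build resolving Spark
from Maven looks like (a minimal sbt sketch; the exact "3.0.0-preview"
version string is my assumption):

    // build.sbt: hedged sketch, the version string is an assumption
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "3.0.0-preview",
      "org.apache.spark" %% "spark-sql"  % "3.0.0-preview",
      "org.apache.spark" %% "spark-hive" % "3.0.0-preview"
    )

Unless the Hadoop 3.2 build is the one published, these coordinates
resolve jars built against the default profile, and (per the Hive
dependency difference Sean mentions below) those are what break on
Java 11.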
Here is an example:
https://github.com/wangyum/test-spark-jdk11/blob/master/src/test/scala/test/spark/HiveTableSuite.scala#L34-L38
https://github.com/wangyum/test-spark-jdk11/commit/927ce7d3766881fba98f2434055fa3a1d1544ad2/checks?check_suite_id=283076578
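
The linked test is roughly of this shape (a hedged sketch; the object
name and the exact SQL statements are made up, see the links above for
the real suite):

    import org.apache.spark.sql.SparkSession

    object HiveTableCheck {
      def main(args: Array[String]): Unit = {
        // With jars built against the default profile, initializing
        // the Hive client on JDK 11 is where the NoSuchMethodError
        // tends to surface.
        val spark = SparkSession.builder()
          .master("local[2]")
          .enableHiveSupport()
          .getOrCreate()

        spark.sql("CREATE TABLE IF NOT EXISTS t(id INT) USING hive")
        spark.sql("INSERT INTO t VALUES (1)")
        spark.sql("SELECT * FROM t").show()
        spark.stop()
      }
    }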


On Sat, Oct 26, 2019 at 10:41 AM Takeshi Yamamuro <linguin....@gmail.com>
wrote:

> Thanks for that work!
>
> > I don't think JDK 11 is a separate release (by design). We build
> > everything targeting JDK 8 and it should work on JDK 11 too.
> +1. A single package working on both JVMs looks nice.
>
>
> On Sat, Oct 26, 2019 at 4:18 AM Sean Owen <sro...@gmail.com> wrote:
>
>> I don't think JDK 11 is a separate release (by design). We build
>> everything targeting JDK 8 and it should work on JDK 11 too.
>>
>> So, just two releases. But frankly, I think we soon need to stop
>> making multiple releases for multiple Hadoop versions and stick to
>> Hadoop 3. For now it's fine to release for Hadoop 2, since that
>> support still exists and since the difference happens to be larger
>> due to the different Hive dependency.
>>
>> On Fri, Oct 25, 2019 at 2:08 PM Xingbo Jiang <jiangxb1...@gmail.com>
>> wrote:
>> >
>> > Hi all,
>> >
>> > I would like to start a discussion on how many packages should be
>> > released in 3.0.0-preview. Here are the ones I can think of so far:
>> >
>> > * Scala 2.12 + Hadoop 2.7
>> > * Scala 2.12 + Hadoop 3.2
>> > * Scala 2.12 + Hadoop 3.2 + JDK 11
>> >
>> > Do you have other combinations to add to the above list?
>> >
>> > Cheers,
>> >
>> > Xingbo
>>
>>
>>
>
> --
> ---
> Takeshi Yamamuro
>
