I think one of the key problems here is the required dependency
upgrades. It would mean many minor breaking changes and a few bigger
ones, notably around Hive, and it forces a Scala 2.12-only update. My
question is whether that even makes sense as a minor release: it
wouldn't be backwards compatible enough with 2.4 to call it a low-risk
update. It would be a smaller step than moving all the way to 3.0,
sure. I'm not strongly against it, but we have to keep in mind how much
work it would then be to maintain two LTS 2.x releases, 2.4 and the
sort-of-compatible 2.5, while proceeding with 3.x.
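
(Purely as an illustration on my side, nothing from the Spark codebase:
anyone trying out such a 2.x build on JDK 11 with Scala 2.12 could log
the versions a job actually runs on, to catch mismatches early. A
minimal sketch:)

  // Hypothetical sketch: print the JVM and Scala versions the application
  // was launched with, useful when validating a JDK 11 / Scala 2.12 build.
  object RuntimeVersions {
    def main(args: Array[String]): Unit = {
      // Standard JVM system property, e.g. "1.8.0_222" on Java 8, "11.0.4" on Java 11
      val javaVersion = System.getProperty("java.version")
      // Version of the Scala library on the classpath, e.g. "2.12.8"
      val scalaVersion = scala.util.Properties.versionNumberString
      println(s"Running on Java $javaVersion, Scala $scalaVersion")
    }
  }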

On Tue, Aug 27, 2019 at 2:01 PM DB Tsai <d_t...@apple.com.invalid> wrote:
>
> Hello everyone,
>
> Thank you all for working on supporting JDK 11 in Apache Spark 3.0 as a community.
>
> Java 8 has already reached end of life for commercial users, and many
> companies are moving to Java 11. Apache Spark 3.0 does not have a release
> date yet, and there are many API incompatibility issues when upgrading from
> Spark 2.x. As a result, asking users to move to Spark 3.0 just to use
> JDK 11 is not realistic.
>
> Should we backport the JDK 11 PRs and cut a release in the 2.x line to support JDK 11?
>
> Or should we cut a new Apache Spark 2.5, since the patches involve some
> dependency changes that are not desirable in a minor release?
>
> Thanks.
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  Apple, Inc
