Hello everyone,

Thank you all for working on supporting JDK11 in Apache Spark 3.0 as a 
community.

Java 8 has already reached end of life for commercial users, and many
companies are moving to Java 11. Apache Spark 3.0 does not have a release
date yet, and there are many API incompatibilities when upgrading from
Spark 2.x. As a result, asking users to move to Spark 3.0 just to get
JDK 11 support is not realistic.

Should we backport the JDK11 PRs and cut a new 2.x release that supports JDK11?

Or should we cut a new Apache Spark 2.5, since the patches involve some
dependency changes that are not desirable in a maintenance release?

Thanks.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc
