Hi Folks,

As we're getting closer to Spark 3 I'd like to revisit a Spark 2.5 release.
Spark 3 brings a number of important changes, and by its nature is not
backward compatible. I think we'd all like to have as smooth an upgrade
experience to Spark 3 as possible, and I believe that having a Spark 2
release with some of the new functionality, while continuing to support the
older APIs and current Scala version, would make the upgrade path smoother.

This pattern is not uncommon in other Hadoop ecosystem projects, like
Hadoop itself and HBase.

I know that Ryan Blue has indicated he is already going to be maintaining
something like that internally at Netflix, and we'll be doing the same
thing at Apple. It seems like having a transitional release could benefit
the community with easy migrations and help avoid duplicated work.

I want to be clear that I'm volunteering to do the work of managing a 2.5
release, so hopefully this wouldn't create any substantial burden on the
community.

Cheers,

Holden
-- 
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9
YouTube Live Streams: https://www.youtube.com/user/holdenkarau