I don't agree with this take. The bottleneck is mostly not Spark itself
-- it is Spark's dependencies, and unfortunately there are a lot of
them. For example, Chill (among others) doesn't support 2.13 yet. I
also don't think 2.13 is that 'mainstream' yet. We are not close to
Scala 2.13 support, so it won't be in 3.0, but I can tell you I've
already made almost all of the Spark-side changes for it.

Keep in mind, too, that if users build with Scala 2.13, all of
_their_ dependencies have to support 2.13 as well.
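To make the constraint concrete, here is a minimal sbt sketch of what
cross-building a library for 2.12 and 2.13 looks like; the version
numbers are illustrative, not taken from Spark's actual build:

```scala
// build.sbt -- a minimal cross-build sketch (versions are illustrative).
// The build can only cross-compile for 2.13 once *every* dependency
// below has published a _2.13 artifact.
crossScalaVersions := Seq("2.12.10", "2.13.1")
scalaVersion := crossScalaVersions.value.head

libraryDependencies ++= Seq(
  // %% appends the Scala binary-version suffix (_2.12 or _2.13) to the
  // artifact name, so resolution fails outright for a Scala version the
  // dependency hasn't been published for -- which is exactly the
  // situation with Chill and 2.13 described above.
  "com.twitter" %% "chill" % "0.9.3"
)
```

Running `sbt +compile` then builds against each listed Scala version in
turn, and stops at the first dependency with no matching artifact.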

Please instead look at the JIRAs for 2.13 support and encourage
dependencies to update.
Or, frankly, maybe Scala should reconsider the mutual incompatibility
between minor releases. They are effectively major releases, and they
cause exactly this kind of headache.

On Wed, Oct 30, 2019 at 5:36 PM antonkulaga <antonkul...@gmail.com> wrote:
>
> Why not try the current Scala (2.13)? Spark has always been one (sometimes
> two) Scala versions behind the rest of the Scala ecosystem, and that has
> always been a big pain point for everybody. I understand that in the past
> you could not switch because of compatibility issues, but 3.x is a major
> version update where you can break things, so maybe you can finally
> consider using the current Scala?
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
