About that pro, I think it's more the opposite: many libraries have
stopped maintaining Scala 2.10 versions. Bugs will no longer be fixed for
Scala 2.10, and new libraries will not be available for Scala 2.10 at all,
making them unusable in Spark.

Take Akka, for example, a distributed messaging library Spark is built on.
Its newest version no longer supports Scala 2.10 today.

Also, since the intention is to support Scala 2.12 for Spark 2 at some
point as well, the burden of supporting three Scala versions would be
significant.
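To give a rough sense of what cross-building means in practice, here is a minimal sbt-style sketch (hypothetical: Spark's real build is Maven-based, and the version numbers are illustrative). Each entry in the list multiplies the compile and test matrix, and divergent APIs often force version-specific source trees on top of that:

```scala
// Hypothetical sbt fragment illustrating the cross-build burden.
// Every Scala version listed here is another full compile + test pass,
// and another set of published artifacts per module.
crossScalaVersions := Seq("2.10.6", "2.11.8", "2.12.0")

// When library APIs diverge between versions, builds typically also need
// version-specific source directories, e.g.:
//   src/main/scala-2.10/
//   src/main/scala-2.11/
// which is extra code to write, review, and keep in sync.
```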

On Wed, Mar 30, 2016 at 9:45 AM, Sean Owen <so...@cloudera.com> wrote:

> (This should fork as its own thread, though it began during discussion
> of whether to continue Java 7 support in Spark 2.x.)
>
> Simply: would like to more clearly take the temperature of all
> interested parties about whether to support Scala 2.10 in the Spark
> 2.x lifecycle. Some of the arguments appear to be:
>
> Pro
> - Some third party dependencies do not support Scala 2.11+ yet and so
> would not be usable in a Spark app
>
> Con
> - Lower maintenance overhead -- no separate 2.10 build,
> cross-building, tests to check, esp considering support of 2.12 will
> be needed
> - Can use 2.11+ features freely
> - 2.10 was EOL in late 2014 and Spark 2.x lifecycle is years to come
>
> I would like to not support 2.10 for Spark 2.x, myself.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
