Dropping Scala 2.10 support has to happen at some point, so I'm not
fundamentally opposed to the idea, but I do have questions about how we go
about making the change and what degree of negative consequences we are
willing to accept.  Until now, we have been saying that 2.10 support will
continue into Spark 2.0.0.  Switching to 2.11 will be non-trivial for some
Spark users, so abruptly dropping 2.10 support is very likely to delay
their migration to Spark 2.0.
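
For context on why the switch is non-trivial: bumping the Scala version
changes the _2.10/_2.11 suffix on every dependency resolved with sbt's %%
operator, so all of a project's Scala dependencies must publish 2.11
artifacts before the project can move. A minimal sketch of the build
change for a hypothetical user project (the coordinates and version
strings below are illustrative, not a real case):

    // build.sbt -- hypothetical project moving from Scala 2.10 to 2.11
    scalaVersion := "2.11.8"  // was "2.10.6"

    libraryDependencies ++= Seq(
      // %% now resolves spark-core_2.11 instead of spark-core_2.10 ...
      "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
      // ... but every other %% dependency must also publish a _2.11
      // artifact, which is exactly where some users get stuck.
      "com.example" %% "some-internal-lib" % "1.0.0"
    )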

What about continuing 2.10 support in 2.0.x, but announcing clearly and
repeatedly, in multiple places, that such support is deprecated, that we
are not committed to maintaining it throughout 2.x, and that it is, in
fact, scheduled to be removed in 2.1.0?

On Wed, Mar 30, 2016 at 7:45 AM, Sean Owen <so...@cloudera.com> wrote:

> (This should fork as its own thread, though it began during discussion
> of whether to continue Java 7 support in Spark 2.x.)
>
> Simply: I would like to more clearly take the temperature of all
> interested parties about whether to support Scala 2.10 in the Spark
> 2.x lifecycle. Some of the arguments appear to be:
>
> Pro
> - Some third-party dependencies do not yet support Scala 2.11+, and so
> would not be usable in a Spark app
>
> Con
> - Lower maintenance overhead -- no separate 2.10 build, cross-building,
> or tests to check, especially considering that support for 2.12 will
> also be needed
> - Can use 2.11+ features freely
> - 2.10 reached end-of-life in late 2014, and the Spark 2.x lifecycle
> will span years to come
>
> Myself, I would prefer not to support 2.10 in Spark 2.x.
>
