My concern is that for some of those stuck on 2.10 because of a library
dependency, three months isn't sufficient time to refactor their
infrastructure to be compatible with Spark 2.0.0 if that requires Scala
2.11.  An additional 3-6 months would make it much more feasible for those
users to stay on board for 2.0.x using the deprecated 2.10 support, then
migrate to 2.11 by the release of Spark 2.1.0.
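
For users in that position, cross-building the application for both Scala
versions during the transition is one way to hedge. A minimal build.sbt
sketch (the app name and exact versions here are illustrative, and it
assumes Spark 2.0.x continues to publish _2.10 artifacts):

    name := "my-spark-app"

    scalaVersion := "2.11.8"

    // build against both binary versions with `sbt +package`
    crossScalaVersions := Seq("2.10.6", "2.11.8")

    // %% selects the matching spark-core_2.10 / spark-core_2.11 artifact
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"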

On Wed, Mar 30, 2016 at 9:01 AM, Sean Owen <so...@cloudera.com> wrote:

> Yeah, it is not crazy to drop support for something foundational like this
> in a feature release, but it is something ideally coupled to a major
> release. You could at least say it is effectively a decision to keep
> supporting 2.10 through the end of the year, given how releases are likely
> to go. Given that the 'right' time to do it is available in the near
> future, does the value of passing that up to extend support for 3-6 more
> months outweigh the negatives? I guess I think 2.10 is already about as
> droppable as it will get, so waiting doesn't buy much. It is an option.
>
>
> On Wed, Mar 30, 2016, 07:44 Cody Koeninger <c...@koeninger.org> wrote:
>
>> I agree with Mark in that I don't see how supporting Scala 2.10 for
>> Spark 2.0 implies supporting it for all of Spark 2.x.
>>
>> Regarding Koert's comment on Akka, I thought all Akka dependencies
>> had been removed from Spark after SPARK-7997 and the recent removal
>> of external/akka.
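>>
>> (A quick, hedged way to sanity-check that against a given build: from
>> spark-shell, probe for an Akka class on the driver classpath.)
>>
>>   // if SPARK-7997 and the external/akka removal took Akka off the
>>   // classpath, this should report the class as absent
>>   try {
>>     Class.forName("akka.actor.ActorSystem")
>>     println("Akka is still on the classpath")
>>   } catch {
>>     case _: ClassNotFoundException => println("no Akka on the classpath")
>>   }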
>>
>> On Wed, Mar 30, 2016 at 9:36 AM, Mark Hamstra <m...@clearstorydata.com>
>> wrote:
>> > Dropping Scala 2.10 support has to happen at some point, so I'm not
>> > fundamentally opposed to the idea; but I've got questions about how we
>> > go about making the change and what degree of negative consequences we
>> > are willing to accept.  Until now, we have been saying that 2.10
>> > support will be continued in Spark 2.0.0.  Switching to 2.11 will be
>> > non-trivial for some Spark users, so abruptly dropping 2.10 support is
>> > very likely to delay migration to Spark 2.0 for those users.
>> >
>> > What about continuing 2.10 support in 2.0.x, but repeatedly making an
>> > obvious announcement in multiple places that such support is
>> > deprecated, that we are not committed to maintaining it throughout
>> > 2.x, and that it is, in fact, scheduled to be removed in 2.1.0?
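>> >
>> > (A sketch of what that warning could look like in code, purely
>> > illustrative; it prints to stderr where a real implementation would
>> > use Spark's own logging.)
>> >
>> >   // hypothetical startup check: warn loudly when running on Scala 2.10
>> >   if (scala.util.Properties.versionNumberString.startsWith("2.10")) {
>> >     Console.err.println("WARN: Scala 2.10 support is deprecated in " +
>> >       "Spark 2.0.x and is scheduled to be removed in Spark 2.1.0.")
>> >   }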
>> >
>> > On Wed, Mar 30, 2016 at 7:45 AM, Sean Owen <so...@cloudera.com> wrote:
>> >>
>> >> (This should fork as its own thread, though it began during discussion
>> >> of whether to continue Java 7 support in Spark 2.x.)
>> >>
>> >> Simply: I would like to more clearly take the temperature of all
>> >> interested parties about whether to support Scala 2.10 in the Spark
>> >> 2.x lifecycle. Some of the arguments appear to be:
>> >>
>> >> Pro
>> >> - Some third-party dependencies do not support Scala 2.11+ yet and so
>> >> would not be usable in a Spark app
>> >>
>> >> Con
>> >> - Lower maintenance overhead -- no separate 2.10 build, no
>> >> cross-building, fewer tests to check, especially considering that
>> >> support for 2.12 will be needed as well
>> >> - Can use 2.11+ features freely (see the sketch below)
>> >> - 2.10 reached end-of-life in late 2014, and the Spark 2.x lifecycle
>> >> will run for years to come
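>> >>
>> >> (One concrete illustration of the 2.11+ point above: Scala 2.11
>> >> lifted the 22-field limit on case classes, which matters for wide
>> >> record types.)
>> >>
>> >>   // compiles on Scala 2.11+ but not on 2.10, where case classes are
>> >>   // capped at 22 fields
>> >>   case class WideRow(
>> >>     f1: Int, f2: Int, f3: Int, f4: Int, f5: Int, f6: Int, f7: Int,
>> >>     f8: Int, f9: Int, f10: Int, f11: Int, f12: Int, f13: Int,
>> >>     f14: Int, f15: Int, f16: Int, f17: Int, f18: Int, f19: Int,
>> >>     f20: Int, f21: Int, f22: Int, f23: Int)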
>> >>
>> >> I would like to not support 2.10 for Spark 2.x, myself.
>> >>
>> >
>>
>
