I agree that putting it in 2.0 doesn't mean keeping Scala 2.10 for the entire 
2.x line. My vote is to keep Scala 2.10 in Spark 2.0, because it's the default 
version we built with in 1.x. We want to make the transition from 1.x to 2.0 as 
easy as possible. In 2.0, we'll make the default downloads Scala 2.11 builds, 
so people can move more easily, but we shouldn't create obstacles that lead to 
fragmenting the community and slowing down Spark 2.0's adoption. I've seen 
companies that stayed on an old Scala version for multiple years because 
switching it, or mixing versions, would affect the company's entire codebase.

Matei

> On Mar 30, 2016, at 12:08 PM, Koert Kuipers <ko...@tresata.com> wrote:
> 
> oh wow, had no idea it got ripped out
> 
> On Wed, Mar 30, 2016 at 11:50 AM, Mark Hamstra <m...@clearstorydata.com> wrote:
> No, with 2.0 Spark really doesn't use Akka: 
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkConf.scala#L744
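> 
> For anyone who wants to confirm against their own deployment, here's a quick 
> classpath probe you can paste into a spark-shell session (just a sketch; the 
> class name is Akka's usual entry point, nothing taken from the linked code):
> 
>     // prints whether any Akka classes are reachable on the classpath
>     try {
>       Class.forName("akka.actor.ActorSystem")
>       println("Akka is on the classpath")
>     } catch {
>       case _: ClassNotFoundException => println("no Akka on the classpath")
>     }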
> 
> On Wed, Mar 30, 2016 at 9:10 AM, Koert Kuipers <ko...@tresata.com> wrote:
> Spark still runs on Akka. So if you want the benefits of the latest Akka (not 
> saying we do, it was just an example), then you need to drop Scala 2.10.
> 
> On Mar 30, 2016 10:44 AM, "Cody Koeninger" <c...@koeninger.org> wrote:
> I agree with Mark in that I don't see how supporting Scala 2.10 for
> Spark 2.0 implies supporting it for all of Spark 2.x.
> 
> Regarding Koert's comment on Akka, I thought all Akka dependencies
> had been removed from Spark after SPARK-7997 and the recent removal
> of external/akka.
> 
> On Wed, Mar 30, 2016 at 9:36 AM, Mark Hamstra <m...@clearstorydata.com> wrote:
> > Dropping Scala 2.10 support has to happen at some point, so I'm not
> > fundamentally opposed to the idea; but I've got questions about how we go
> > about making the change and what degree of negative consequences we are
> > willing to accept.  Until now, we have been saying that 2.10 support will be
> > continued in Spark 2.0.0.  Switching to 2.11 will be non-trivial for some
> > Spark users, so abruptly dropping 2.10 support is very likely to delay
> > migration to Spark 2.0 for those users.
> >
> > What about continuing 2.10 support in 2.0.x, but repeatedly making an
> > obvious announcement in multiple places that such support is deprecated,
> > that we are not committed to maintaining it throughout 2.x, and that it is,
> > in fact, scheduled to be removed in 2.1.0?
> >
> > On Wed, Mar 30, 2016 at 7:45 AM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> (This should fork as its own thread, though it began during discussion
> >> of whether to continue Java 7 support in Spark 2.x.)
> >>
> >> Simply: I would like to more clearly take the temperature of all
> >> interested parties about whether to support Scala 2.10 in the Spark
> >> 2.x lifecycle. Some of the arguments appear to be:
> >>
> >> Pro
> >> - Some third-party dependencies do not support Scala 2.11+ yet and so
> >> would not be usable in a Spark app
> >>
> >> Con
> >> - Lower maintenance overhead -- no separate 2.10 build,
> >> cross-building, or extra tests to check (see the sketch below),
> >> especially considering that 2.12 support will also be needed
> >> - Can use 2.11+ features freely
> >> - 2.10 was EOL in late 2014 and Spark 2.x lifecycle is years to come
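> >>
> >> For reference, a minimal sketch of what cross-building means for an
> >> sbt build (the version numbers here are illustrative, not a proposal):
> >>
> >>   // build.sbt -- each artifact is built and published once per Scala
> >>   // binary version listed here
> >>   scalaVersion := "2.11.8"
> >>   crossScalaVersions := Seq("2.10.6", "2.11.8")
> >>   // "%%" appends the binary suffix (_2.10 or _2.11) to the artifact
> >>   // name, which is also why a library published only for 2.10 cannot
> >>   // be mixed into a 2.11 build
> >>   libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
> >>
> >> Running "sbt +test" then repeats the whole test cycle once per entry
> >> in crossScalaVersions.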
> >>
> >> I would like to not support 2.10 for Spark 2.x, myself.
> >>