Yes, they do. We haven't dropped 2.10 support yet; there are too many
active 2.10 deployments out there.
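
For anyone who still needs a 2.10 build locally, something like the
following should work against current master. Caveat: the
change-scala-version.sh script and the -Dscala-2.10 profile flag are my
recollection of the cross-build setup, so double-check against your
checkout before relying on them:

    # switch the POMs over to Scala 2.10 before building
    ./dev/change-scala-version.sh 2.10

    # Maven build with the 2.10 profile; skip tests for a quick compile check
    build/mvn -Dscala-2.10 -DskipTests clean package

    # or the sbt equivalent, mirroring the Jenkins compile jobs linked below
    build/sbt -Dscala-2.10 clean package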


On Mon, Feb 1, 2016 at 11:33 AM, Jakob Odersky <ja...@odersky.com> wrote:

> Awesome!
> +1 on Steve Loughran's question, how does this affect support for
> 2.10? Do future contributions need to work with Scala 2.10?
>
> cheers
>
> On Mon, Feb 1, 2016 at 7:02 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> > The following jobs have been established for build against Scala 2.10:
> >
> >
> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/SPARK-master-COMPILE-MAVEN-SCALA-2.10/
> >
> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/SPARK-master-COMPILE-sbt-SCALA-2.10/
> >
> > FYI
> >
> > On Mon, Feb 1, 2016 at 4:22 AM, Steve Loughran <ste...@hortonworks.com>
> > wrote:
> >>
> >>
> >> On 30 Jan 2016, at 08:22, Reynold Xin <r...@databricks.com> wrote:
> >>
> >> FYI - I just merged Josh's pull request to switch to Scala 2.11 as the
> >> default build.
> >>
> >> https://github.com/apache/spark/pull/10608
> >>
> >>
> >>
> >> does this mean that Scala 2.10 compatibility & testing are no longer
> >> needed?
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
