It's already underway: https://github.com/apache/spark/pull/10608

On Fri, Jan 29, 2016 at 11:50 AM, Jakob Odersky <ja...@odersky.com> wrote:

> I'm not an authoritative source but I think it is indeed the plan to
> move the default build to 2.11.
>
> See this discussion for more detail
>
> http://apache-spark-developers-list.1001551.n3.nabble.com/A-proposal-for-Spark-2-0-td15122.html
>
> On Fri, Jan 29, 2016 at 11:43 AM, Deenar Toraskar
> <deenar.toras...@gmail.com> wrote:
> > A related question. Are there plans to move the default Spark builds to
> > Scala 2.11 with Spark 2.0?
> >
> > Regards
> > Deenar
> >
> > On 27 January 2016 at 19:55, Michael Armbrust <mich...@databricks.com>
> > wrote:
> >>
> >> We do maintenance releases on demand when there is enough to justify
> >> doing one. I'm hoping to cut 1.6.1 soon, but have not had time yet.
> >>
> >> On Wed, Jan 27, 2016 at 8:12 AM, Daniel Siegmann
> >> <daniel.siegm...@teamaol.com> wrote:
> >>>
> >>> Will there continue to be monthly releases on the 1.6.x branch during
> >>> the additional time for bug fixes and such?
> >>>
> >>> On Tue, Jan 26, 2016 at 11:28 PM, Koert Kuipers <ko...@tresata.com>
> >>> wrote:
> >>>>
> >>>> thanks, that's all i needed
> >>>>
> >>>> On Tue, Jan 26, 2016 at 6:19 PM, Sean Owen <so...@cloudera.com>
> >>>> wrote:
> >>>>>
> >>>>> I think it will come significantly later -- or else we'd be at code
> >>>>> freeze for 2.x in a few days. I haven't heard anyone discuss this
> >>>>> officially, but May or so has instead been batted around informally
> >>>>> in conversation. Does anyone have a particularly strong opinion on
> >>>>> that? That's basically an extra 3-month period.
> >>>>>
> >>>>> https://cwiki.apache.org/confluence/display/SPARK/Wiki+Homepage
> >>>>>
> >>>>> On Tue, Jan 26, 2016 at 10:00 PM, Koert Kuipers <ko...@tresata.com>
> >>>>> wrote:
> >>>>> > Is the idea that Spark 2.0 comes out roughly 3 months after 1.6?
> >>>>> > So quarterly releases as usual?
> >>>>> > Thanks
> >>>>
> >>>>
> >>>
> >>
> >
>
