Brane,

Spark already has limited Scala 2.11 support.
http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
It has problems with third-party libs, though: "Scala 2.11 support in Spark does
not support a few features due to dependencies which are themselves not
Scala 2.11 ready. Specifically, Spark’s external Kafka library and JDBC
component are not yet supported in Scala 2.11 builds."
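For reference, the linked page (Spark 1.4-era docs; the exact script name has
changed across Spark versions) gives roughly these build steps:

    dev/change-version-to-2.11.sh
    mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package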

On Tue, Jun 2, 2015 at 4:52 PM, Branko Čibej <br...@apache.org> wrote:

> On 01.06.2015 18:06, Dmitriy Setrakyan wrote:
> > On Mon, Jun 1, 2015 at 9:01 AM, Nikita Ivanov <nivano...@gmail.com>
> > wrote:
> >
> >> Generally, I'd vote for 2.11 not because of any language features but
> >> because of stability (and the tooling around it). 2.11 is very stable
> >> (compared to 2.10).
> >>
> > If we go with 2.11, how do we make it work with Spark, which runs on
> > Scala 2.10?
>
> Teach Spark to use Scala 2.11, of course.
>
> I find it inconceivable that minor Scala versions are incompatible to
> this extent. Looks like a major fail to me ... but not that uncommon;
> notice that Ruby has the same lack of backward compatibility.
>
> -- Brane
>
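For context: Scala deliberately does not guarantee binary compatibility
across minor releases (2.10 vs. 2.11), which is why every library has to be
cross-published per Scala version. A minimal sbt sketch (version numbers
are illustrative):

    // build.sbt -- version numbers are illustrative
    scalaVersion := "2.11.6"

    // %% appends the Scala binary version to the artifact name,
    // so this resolves to spark-core_2.11 rather than spark-core_2.10.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"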



-- 
Alexey Kuznetsov
GridGain Systems
www.gridgain.com
