Most of the 2.11 issues are being resolved in Spark 1.4. For a while now, the
Spark project has published Maven artifacts compiled against both Scala 2.10
and 2.11, although the downloads at http://spark.apache.org/downloads.html are
still all built for 2.10.
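
If you want to try the 2.11 artifacts from Maven Central, a minimal sbt
sketch looks like this (the version numbers here are illustrative, not a
recommendation):

    // build.sbt -- minimal sketch; version numbers are illustrative
    scalaVersion := "2.11.6"

    // %% appends the Scala binary version, so this resolves spark-core_2.11
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"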

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Tue, May 26, 2015 at 10:33 AM, Ritesh Kumar Singh <
[email protected]> wrote:

> Yes, the recommended version is 2.10, as not all features are supported on
> 2.11 yet. The Kafka libraries and JDBC components are yet to be ported to
> 2.11. So if your project doesn't depend on those components, you can give
> 2.11 a try.
>
> Here's a link
> <https://spark.apache.org/docs/1.2.0/building-spark.html#building-for-scala-211>
> for building Spark with 2.11.
>
> That said, you won't run into any issues if you stay on 2.10 for now. Keep
> in mind, though, that future releases will have to shift to 2.11 once
> support for 2.10 eventually ends.
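>
> If you want to be ready for that shift, a minimal sbt sketch for
> cross-building your own project against both Scala versions (again, the
> version numbers are illustrative):
>
>     // build.sbt -- cross-compile the same code against 2.10 and 2.11
>     crossScalaVersions := Seq("2.10.5", "2.11.6")
>
>     // %% resolves spark-core_2.10 or spark-core_2.11 per cross-build
>     libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
>
> Running "+test" in sbt then runs the tests once per listed Scala version.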
>
>
> On Tue, May 26, 2015 at 8:21 PM, Punyashloka Biswal <
> [email protected]> wrote:
>
>> Dear Spark developers and users,
>>
>> Am I correct in believing that the recommended version of Scala to use
>> with Spark is currently 2.10? Is there any plan to switch to 2.11 in the
>> future? Are there any advantages to using 2.11 today?
>>
>> Regards,
>> Punya
