For #1, the published 2.11 jars on Maven Central are usable.
However, you should build from source if you need a specific combination
of build profiles.
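
For example, if the published artifacts are enough, an sbt build along
these lines should pull in the 2.11 jars (version numbers are
illustrative; check Maven Central for the latest):

    scalaVersion := "2.11.7"

    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"

And if you need your own profile combination, building from source looks
roughly like this (assuming the flags documented for the 1.5 line; adjust
the Hadoop profile to match your cluster):

    ./dev/change-scala-version.sh 2.11
    build/mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package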

Cheers

On Wed, Nov 11, 2015 at 3:22 PM, shajra-cogscale <sha...@cognitivescale.com>
wrote:

> Hi,
>
> My company isn't using Spark in production yet, but we are using a bit of
> Scala.  A few people have wanted to be conservative and keep our Scala at
> 2.10 in the event we start using Spark.  Others want to move to 2.11 with
> the idea that by the time we're using Spark it will be more or less
> 2.11-ready.
>
> It's hard to make a strong judgement on these kinds of things without
> getting some community feedback.
>
> Looking around the internet, I saw:
>
> 1) There's advice to build 2.11 packages from source -- but there are also
> jars for 2.11 published to Maven Central.  Are the jars on Maven Central
> usable, and is the advice to build from source outdated?
>
> 2) There's a note that the JDBC RDD isn't 2.11-compliant.  This is okay
> for us, but is there anything else to worry about?
>
> It would be nice to get answers to those questions, as well as any other
> feedback from maintainers or anyone who's used Spark with Scala 2.11
> beyond simple examples.
>
> Thanks,
> Sukant
>
