I did the following:
dev/change-version-to-2.11.sh
mvn -DHADOOP_PROFILE=hadoop-2.4 -Pyarn,hive -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

And the mvn command passed.

Did you see any cross-compilation errors?

Cheers

BTW, the two links you mentioned are consistent in terms of building for
Scala 2.11.
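
For your second question, something like the following in your pom.xml should
work. This is a minimal sketch: the Spark version (1.2.0) and the availability
of a published _2.11 artifact for that version are assumptions you should
verify on Maven Central.

  <properties>
    <scala.binary.version>2.11</scala.binary.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.4</version>
    </dependency>
    <dependency>
      <!-- Spark appends the Scala binary version to each artifactId,
           so the _2.11 suffix selects the Scala 2.11 build -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>1.2.0</version>
    </dependency>
  </dependencies>

If you also compile Scala sources in the same project, you would configure
scala-maven-plugin with a matching Scala version.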

On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat <walrusthe...@gmail.com>
wrote:

> Hi,
>
> When I run this:
>
> dev/change-version-to-2.11.sh
> mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
>
> as per here
> <https://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211>,
> Maven fails to build Spark's dependencies (I hit cross-compilation errors).
>
> Only when I run:
>
> dev/change-version-to-2.11.sh
> sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
>
> as gathered from here 
> <https://github.com/ScrapCodes/spark-1/blob/patch-3/docs/building-spark.md>, 
> do I get Spark's dependencies built without any cross-compilation errors.
>
> *Questions*:
>
> - How can I make Maven do this?
>
> - How can I specify the use of Scala 2.11 in my own .pom files?
>
> Thanks
>
>
