Good news - and Java 8 as well. I saw Matei after his talk at Scala Days, and he said he would look into making 2.11 the default, but it seems that is already the plan. Scala 2.12 is getting closer as well.
On Mon, May 16, 2016 at 2:55 PM, Ted Yu wrote:
For 2.0, I believe that is the case.
Jenkins jobs have been running against Scala 2.11:
[INFO] --- scala-maven-plugin:3.2.2:testCompile
(scala-test-compile-first) @ java8-tests_2.11 ---
FYI
On Mon, May 16, 2016 at 2:45 PM, Eric Richardson wrote:
On Thu, May 12, 2016 at 9:23 PM, Luciano Resende wrote:
> Spark has moved to build using Scala 2.11 by default in master/trunk.
>
Does this mean that the pre-built binaries for download will also move to 2.11?
Thank you for the response.
I used the following command to build from source
build/mvn -Dhadoop.version=2.6.4 -Phadoop-2.6 -DskipTests clean package
Would this put the required jars in .ivy2 during the build process? If
so, how can I make the Spark distribution runnable, so that I can use
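If the goal is a runnable distribution rather than just compiled jars, one option (a sketch, assuming the same Hadoop settings as the mvn command above) is Spark's make-distribution script:

```shell
# Build a runnable, self-contained Spark distribution: produces a .tgz
# plus an exploded dist/ directory you can run bin/spark-shell from.
# The --name suffix and Hadoop settings here are illustrative.
./dev/make-distribution.sh --name custom-2.6.4 --tgz \
  -Phadoop-2.6 -Dhadoop.version=2.6.4
```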
Spark has moved to build using Scala 2.11 by default in master/trunk.
As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and
you might be missing some modules/profiles for your build. What command did
you use to build?
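For reference, a fuller build command enabling commonly used modules might look like the following (a sketch; which profiles you actually need depends on your deployment):

```shell
# -Pyarn adds the YARN module; -Phive and -Phive-thriftserver add Hive
# support. These are illustrative; enable only the profiles you use.
build/mvn -Pyarn -Phive -Phive-thriftserver \
  -Phadoop-2.6 -Dhadoop.version=2.6.4 -DskipTests clean package
```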
On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju wrote:
Hello All,
I built Spark from the source code available at
https://github.com/apache/spark/. Although I haven't specified the
"-Dscala-2.11" option (to build with Scala 2.11), from the build messages I
see that it ended up using Scala 2.11. Now, for my application sbt, what
should be the spark
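A minimal build.sbt for compiling against a locally built 2.0.0-SNAPSHOT could look like this (a sketch; it assumes the snapshot was installed into the local Maven repository with `mvn install`, and the version strings are illustrative):

```scala
// build.sbt -- %% appends the Scala binary suffix (_2.11) to the artifact
// name, so scalaVersion must match the version Spark was built with.
scalaVersion := "2.11.8"

// Pick up the locally installed 2.0.0-SNAPSHOT artifacts.
resolvers += Resolver.mavenLocal

// "provided" keeps Spark out of the assembly jar when submitting to a cluster.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT" % "provided"
```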