To run the full test suite, you need to build the assembly jar first.

I've seen this asked a few times. Maybe we should make it more obvious.
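
Concretely, for the command quoted below, that would mean packaging
first and then testing, something along these lines (same profiles as
in the quoted message):

mvn -Pyarn -Phadoop-2.3 -Phive -DskipTests package
mvn -Pyarn -Phadoop-2.3 -Phive test

The package step is what produces the assembly under
assembly/target/scala-2.10 that the failing tests are looking for.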



http://spark.apache.org/docs/latest/building-with-maven.html

Spark Tests in Maven

Tests are run by default via the ScalaTest Maven plugin
<http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin>.
Some of the tests require Spark to be packaged first, so always run mvn
package with -DskipTests the first time. You can then run the tests
with mvn -Dhadoop.version=... test.
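
As a quick sanity check after the package step, the assembly should
show up under the assembly module's target directory (path taken from
the error quoted below; the scala-2.10 part depends on the Scala
version you build with):

ls assembly/target/scala-2.10/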

The ScalaTest plugin also supports running only a specific test suite as
follows:

mvn -Dhadoop.version=... -DwildcardSuites=org.apache.spark.repl.ReplSuite test
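
If I'm remembering the plugin's matching rules correctly, wildcardSuites
should also accept a package name to run every suite under it, e.g.:

mvn -Dhadoop.version=... -DwildcardSuites=org.apache.spark.repl test

(worth double-checking against the plugin docs linked above)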





On Sun, Jul 27, 2014 at 7:07 PM, Stephen Boesch <java...@gmail.com> wrote:

> I pulled the latest from GitHub this afternoon. There are many, many
> errors:
>
> <source_home>/assembly/target/scala-2.10: No such file or directory
>
> This causes many tests to fail.
>
> Here is the command line I am running
>
>     mvn -Pyarn -Phadoop-2.3 -Phive package test
>
