Hi Reynold,
thanks for responding here. Yes, I had looked at the Building with Maven
page in the past, but I had not noticed that the "package" step must happen
*before* the test. I had assumed it was a corequisite, as seen in my
command line.

So the following sequence appears to work fine (so far, so good; well past
the point where the prior attempts failed):

    mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive clean package
    mvn -Pyarn -Phadoop-2.3 -Phive test
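
A quick sanity check between the two steps (just a sketch; the path assumes
the default Scala 2.10 output directory named in the error quoted below) is
to confirm the assembly jar actually exists before running the tests:

    # hypothetical check, not part of the build itself: the directory the
    # failing tests complained about should now exist and contain the jar
    ls assembly/target/scala-2.10/spark-assembly*.jar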

As for documentation, yes, adding another sentence to that same "Building
with Maven" page would likely be helpful to future generations.


2014-07-27 19:10 GMT-07:00 Reynold Xin <r...@databricks.com>:

> To run through all the tests you'd need to create the assembly jar first.
>
>
> I've seen this asked a few times. Maybe we should make it more obvious.
>
>
>
> http://spark.apache.org/docs/latest/building-with-maven.html
>
> Spark Tests in Maven
>
> Tests are run by default via the ScalaTest Maven plugin
> <http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin>.
> Some of the tests require Spark to be packaged first, so always run mvn
> package with -DskipTests the first time. You can then run the tests with
> mvn -Dhadoop.version=... test.
>
> The ScalaTest plugin also supports running only a specific test suite as
> follows:
>
> mvn -Dhadoop.version=... -DwildcardSuites=org.apache.spark.repl.ReplSuite
> test
>
> On Sun, Jul 27, 2014 at 7:07 PM, Stephen Boesch <java...@gmail.com> wrote:
>
> > I have pulled the latest from GitHub this afternoon. There are many,
> > many errors:
> >
> > <source_home>/assembly/target/scala-2.10: No such file or directory
> >
> > This causes many tests to fail.
> >
> > Here is the command line I am running:
> >
> >     mvn -Pyarn -Phadoop-2.3 -Phive package test
> >
>
