On Tue, Oct 28, 2014 at 6:18 PM, Niklas Wilcke
<1wil...@informatik.uni-hamburg.de> wrote:
> 1. via dev/run-tests script
>     This script executes all tests and takes several hours to finish.
> Some tests failed, but I can't tell which ones. Should it really take
> that long? Can I run only the MLlib tests?

Yes, running all tests takes a long long time. It does print which
tests failed, and you can see the errors in the test output.

Have you read the "Spark Tests in Maven" section of the docs?
http://spark.apache.org/docs/latest/building-with-maven.html#spark-tests-in-maven
It shows how to run just one test suite.
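For example, the ScalaTest plugin described there can run a single
suite; something like the following should work (the suite name here is
just an illustration, substitute the one you care about):

mvn -Pyarn -Phadoop-2.3 -Phive -DwildcardSuites=org.apache.spark.mllib.clustering.KMeansSuite test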

In any Maven project you can try things like "mvn test -pl [module]"
to run just one module's tests.
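For instance, to run only MLlib's tests (assuming the module path is
"mllib"; you may first need "mvn -DskipTests install" so the modules it
depends on can be resolved from your local repository):

mvn -Pyarn -Phadoop-2.3 -Phive -pl mllib test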


> 2. directly via maven
> I did the following, as described in the docs [0].
>
> export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M
> -XX:ReservedCodeCacheSize=512m"
> mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive clean package
> mvn -Pyarn -Phadoop-2.3 -Phive test
>
> This also doesn't work.
> Why do I have to package Spark before running the tests?

What doesn't work, specifically?
Some tests run against the built assembly, which is why packaging has
to happen before the tests.
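Concretely, those suites look for the assembly JAR that the package
step produces, e.g. (path and version here assume a 1.1 build with the
hadoop-2.3 profile; yours may differ):

assembly/target/scala-2.10/spark-assembly-1.1.0-hadoop2.3.0.jar

If that JAR doesn't exist, the assembly-dependent tests can't run.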


> 3. via sbt
> I tried the following. I freshly cloned Spark and checked out the tag
> v1.1.0-rc4.
>
> sbt/sbt "project mllib" test
>
> and got the following failure in several cluster tests.
>
> [info] - task size should be small in both training and prediction ***
> FAILED ***

This just looks like a flaky test failure; I'd try again.
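If you want to re-run just the failing suite rather than all of MLlib,
sbt's test-only task accepts wildcards; something like this should do
it (adjust the pattern to match the suite name in your test output):

sbt/sbt "project mllib" "test-only *ClusterSuite"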
