Looking at the pom.xml, I think I found the reason: surefire is configured
to skip tests because scalatest is used instead.
With the following diff:

diff --git a/pom.xml b/pom.xml
index b7df53d..b0da893 100644
--- a/pom.xml
+++ b/pom.xml
@@ -947,7 +947,7 @@
           <version>2.17</version>
           <configuration>
             <!-- Uses scalatest instead -->
-            <skipTests>true</skipTests>
+            <skipTests>false</skipTests>
           </configuration>
         </plugin>
         <plugin>
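
For context, surefire is skipped because the scalatest-maven-plugin is bound
to the test phase and runs the Scala suites instead. Roughly, the wiring
looks like this (a sketch from memory, not copied from Spark's pom.xml):

<!-- sketch only; the real block carries much more configuration -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>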

I was able to run JavaAPISuite using:

mvn test -pl core -Dtest=JavaAPISuite

But it takes a long time ...
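
I believe the slowness is because -Dtest= only filters the surefire (JUnit)
run, while the scalatest plugin still runs every Scala suite in the module.
If the plugin's wildcardSuites parameter is wired up (it is accepted as a
system property in the plugin versions I've seen), something like this
should confine the run to the Java suite:

mvn test -pl core -DwildcardSuites=none -Dtest=JavaAPISuite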

Cheers

On Sun, Dec 7, 2014 at 8:56 AM, Koert Kuipers <ko...@tresata.com> wrote:

> hey guys,
> i was able to run the test just fine with:
> $ sbt
> > project core
> > testOnly org.apache.spark.JavaAPISuite
>
> however i found it strange that it didn't run when i do "mvn test -pl
> core", or at least it didn't seem like it ran to me. this would mean that
> when someone tests/publishes with maven the Java API does not get tested
> at all.
>
> On Sun, Dec 7, 2014 at 1:52 AM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> Not sure about maven, but you can run that test with sbt:
>>
>> sbt/sbt "sql/test-only org.apache.spark.sql.api.java.JavaAPISuite"
>>
>> On Sat, Dec 6, 2014 at 9:59 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> I tried to run tests for core but there were failures. e.g. :
>>>
>>> ExternalAppendOnlyMapSuite:
>>> - simple insert
>>> - insert with collision
>>> - ordering
>>> - null keys and values
>>> - simple aggregator
>>> - simple cogroup
>>> Spark assembly has been built with Hive, including Datanucleus jars on
>>> classpath
>>> - spilling *** FAILED ***
>>>   org.apache.spark.SparkException: Job aborted due to stage failure:
>>> Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task
>>> 0.3 in stage 0.0 (TID 6, localhost): java.lang.ClassNotFoundException:
>>> org.apache.spark.rdd.RDD$$anonfun$map$1
>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>   at java.security.AccessController.doPrivileged(Native Method)
>>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>   at java.lang.Class.forName0(Native Method)
>>>   at java.lang.Class.forName(Class.java:270)
>>>
>>> BTW I didn't find JavaAPISuite in test output either.
>>>
>>> Cheers
>>>
>>> On Sat, Dec 6, 2014 at 9:12 PM, Koert Kuipers <ko...@tresata.com> wrote:
>>>
>>>> Ted,
>>>> i mean
>>>> core/src/test/java/org/apache/spark/JavaAPISuite.java
>>>>
>>>> On Sat, Dec 6, 2014 at 9:27 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>
>>>>> Pardon me, the test is here:
>>>>>
>>>>> sql/core/src/test/java/org/apache/spark/sql/api/java/JavaAPISuite.java
>>>>>
>>>>> You can run 'mvn test' under sql/core
>>>>>
>>>>> Cheers
>>>>>
>>>>> On Sat, Dec 6, 2014 at 5:55 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>>
>>>>>> In the master branch, I only found JavaAPISuite mentioned in comments:
>>>>>>
>>>>>> spark tyu$ find . -name '*.scala' -exec grep JavaAPISuite {} \; -print
>>>>>>    * For usage example, see test case JavaAPISuite.testJavaJdbcRDD.
>>>>>>    * converted into a `Object` array. For usage example, see test
>>>>>> case JavaAPISuite.testJavaJdbcRDD.
>>>>>> ./core/src/main/scala/org/apache/spark/rdd/JdbcRDD.scala
>>>>>>
>>>>>> FYI
>>>>>>
>>>>>> On Sat, Dec 6, 2014 at 5:43 PM, Koert Kuipers <ko...@tresata.com>
>>>>>> wrote:
>>>>>>
>>>>>>> when i run "mvn test -pl core", i don't see JavaAPISuite being run.
>>>>>>> or if it is, it's being very very quiet about it. is this by design?
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
