Running JavaAPISuite (master branch) on Linux, I got:

testGuavaOptional(org.apache.spark.JavaAPISuite)  Time elapsed: 32.945 sec  <<< ERROR!
org.apache.spark.SparkException: Job aborted due to stage failure: Master removed our application: FAILED
  at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
  at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
  at scala.Option.foreach(Option.scala:236)
  at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
  at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
  at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
  at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
  at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
  at akka.actor.ActorCell.invoke(ActorCell.scala:487)
  at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
  at akka.dispatch.Mailbox.run(Mailbox.scala:220)
  at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
  at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
  at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
  at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
  at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

FYI
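
For reference, I believe testGuavaOptional exercises Guava's Optional flowing
through the Java API (the 1.x Java API exposes com.google.common.base.Optional,
e.g. from leftOuterJoin). A rough, hypothetical sketch of that usage pattern,
not the actual suite code (class and app names below are made up):

import java.util.Arrays;
import java.util.List;

import com.google.common.base.Optional;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class GuavaOptionalSketch {
  public static void main(String[] args) {
    // Plain local master, just for the sketch.
    JavaSparkContext sc = new JavaSparkContext(
        new SparkConf().setMaster("local").setAppName("guava-optional-sketch"));

    JavaPairRDD<Integer, String> left = sc.parallelizePairs(
        Arrays.asList(new Tuple2<Integer, String>(1, "a"),
                      new Tuple2<Integer, String>(2, "b")));
    JavaPairRDD<Integer, Integer> right = sc.parallelizePairs(
        Arrays.asList(new Tuple2<Integer, Integer>(1, 10)));

    // Keys missing on the right side come back as Optional.absent().
    List<Tuple2<Integer, Tuple2<String, Optional<Integer>>>> joined =
        left.leftOuterJoin(right).collect();
    for (Tuple2<Integer, Tuple2<String, Optional<Integer>>> t : joined) {
      System.out.println(t._1() + " -> " + t._2());
    }

    sc.stop();
  }
}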

On Sun, Dec 7, 2014 at 3:25 PM, Sean Owen <[email protected]> wrote:

> I think it's a known issue:
>
> https://issues.apache.org/jira/browse/SPARK-4159
> https://issues.apache.org/jira/browse/SPARK-661
>
> I got bit by this too recently and meant to look into it.
>
> On Sun, Dec 7, 2014 at 4:50 PM, Koert Kuipers <[email protected]> wrote:
> > So as part of the official build the Java API does not get tested then?
> > I am sure there is a good reason for it, but that's surprising to me.
> >
> > On Sun, Dec 7, 2014 at 12:19 PM, Ted Yu <[email protected]> wrote:
> >>
> >> Looking at the pom.xml, I think I found the reason - scalatest is used.
> >> With the following diff:
> >>
> >> diff --git a/pom.xml b/pom.xml
> >> index b7df53d..b0da893 100644
> >> --- a/pom.xml
> >> +++ b/pom.xml
> >> @@ -947,7 +947,7 @@
> >>            <version>2.17</version>
> >>            <configuration>
> >>              <!-- Uses scalatest instead -->
> >> -            <skipTests>true</skipTests>
> >> +            <skipTests>false</skipTests>
> >>            </configuration>
> >>          </plugin>
> >>          <plugin>
> >>
> >> I was able to run JavaAPISuite using:
> >>
> >> mvn test -pl core -Dtest=JavaAPISuite
> >>
> >> But it takes a long time ...
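> >>
> >> If the 2.17 surefire plugin supports the single-method selector (I
> >> believe it does), the run can probably be narrowed to a single test
> >> method, e.g.:
> >>
> >> mvn test -pl core -Dtest=JavaAPISuite#testGuavaOptional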
> >>
> >> Cheers
> >>
> >> On Sun, Dec 7, 2014 at 8:56 AM, Koert Kuipers <[email protected]> wrote:
> >>>
> >>> Hey guys,
> >>> I was able to run the test just fine with:
> >>> $ sbt
> >>> > project core
> >>> > testOnly org.apache.spark.JavaAPISuite
> >>>
> >>> However, I found it strange that it didn't run when I do "mvn test -pl
> >>> core", or at least it didn't seem like it ran to me. This would mean
> >>> that when someone tests/publishes with Maven, the Java API does not
> >>> get tested at all.
> >>>
> >>> On Sun, Dec 7, 2014 at 1:52 AM, Michael Armbrust <[email protected]> wrote:
> >>>>
> >>>> Not sure about maven, but you can run that test with sbt:
> >>>>
> >>>> sbt/sbt "sql/test-only org.apache.spark.sql.api.java.JavaAPISuite"
> >>>>
> >>>> On Sat, Dec 6, 2014 at 9:59 PM, Ted Yu <[email protected]> wrote:
> >>>>>
> >>>>> I tried to run tests for core but there were failures, e.g.:
> >>>>>
> >>>>> ExternalAppendOnlyMapSuite:
> >>>>> - simple insert
> >>>>> - insert with collision
> >>>>> - ordering
> >>>>> - null keys and values
> >>>>> - simple aggregator
> >>>>> - simple cogroup
> >>>>> Spark assembly has been built with Hive, including Datanucleus jars
> >>>>> on classpath
> >>>>> - spilling *** FAILED ***
> >>>>>   org.apache.spark.SparkException: Job aborted due to stage failure:
> >>>>>   Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task
> >>>>>   0.3 in stage 0.0 (TID 6, localhost): java.lang.ClassNotFoundException:
> >>>>>   org.apache.spark.rdd.RDD$$anonfun$map$1
> >>>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >>>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>>>>   at java.security.AccessController.doPrivileged(Native Method)
> >>>>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >>>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> >>>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> >>>>>   at java.lang.Class.forName0(Native Method)
> >>>>>   at java.lang.Class.forName(Class.java:270)
> >>>>>
> >>>>> BTW I didn't find JavaAPISuite in test output either.
> >>>>>
> >>>>> Cheers
> >>>>>
> >>>>> On Sat, Dec 6, 2014 at 9:12 PM, Koert Kuipers <[email protected]>
> >>>>> wrote:
> >>>>>>
> >>>>>> Ted,
> >>>>>> i mean
> >>>>>> core/src/test/java/org/apache/spark/JavaAPISuite.java
> >>>>>>
> >>>>>> On Sat, Dec 6, 2014 at 9:27 PM, Ted Yu <[email protected]> wrote:
> >>>>>>>
> >>>>>>> Pardon me, the test is here:
> >>>>>>>
> >>>>>>>
> >>>>>>> sql/core/src/test/java/org/apache/spark/sql/api/java/JavaAPISuite.java
> >>>>>>>
> >>>>>>> You can run 'mvn test' under sql/core
> >>>>>>>
> >>>>>>> Cheers
> >>>>>>>
> >>>>>>>> On Sat, Dec 6, 2014 at 5:55 PM, Ted Yu <[email protected]> wrote:
> >>>>>>>>
> >>>>>>>> In master branch, I only found JavaAPISuite in comment:
> >>>>>>>>
> >>>>>>>> spark tyu$ find . -name '*.scala' -exec grep JavaAPISuite {} \; -print
> >>>>>>>>    * For usage example, see test case JavaAPISuite.testJavaJdbcRDD.
> >>>>>>>>    * converted into a `Object` array. For usage example, see test
> >>>>>>>>      case JavaAPISuite.testJavaJdbcRDD.
> >>>>>>>> ./core/src/main/scala/org/apache/spark/rdd/JdbcRDD.scala
> >>>>>>>>
> >>>>>>>> FYI
> >>>>>>>>
> >>>>>>>> On Sat, Dec 6, 2014 at 5:43 PM, Koert Kuipers <[email protected]>
> >>>>>>>> wrote:
> >>>>>>>>>
> >>>>>>>>> When I run "mvn test -pl core", I don't see JavaAPISuite being run.
> >>>>>>>>> Or if it is, it's being very, very quiet about it. Is this by design?
> >>>>>>>>
> >>>>>>>>
> >>>>>>>
> >>>>>>
> >>>>>
> >>>>
> >>>
> >>
> >
>
