I'm at the latest commit:

commit f0e23a023ce1356bc0f04248605c48d4d08c2d05
Merge: aec9bf9 a197137
Author: Reynold Xin <[email protected]>
Date:   Tue Oct 29 01:41:44 2013 -0400

and I'm seeing this when I run "test-only FileServerSuite":
13/10/30 09:35:04.300 INFO DAGScheduler: Completed ResultTask(0, 0)
13/10/30 09:35:04.307 INFO LocalTaskSetManager: Loss was due to java.io.StreamCorruptedException
java.io.StreamCorruptedException: invalid type code: AC
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:348)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
    at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:101)
    at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
    at scala.collection.Iterator$$anon$21.hasNext(Iterator.scala:440)
    at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:26)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:27)
    at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:53)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKey$2.apply(PairRDDFunctions.scala:95)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKey$2.apply(PairRDDFunctions.scala:94)
    at org.apache.spark.rdd.MapPartitionsWithContextRDD.compute(MapPartitionsWithContextRDD.scala:40)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:237)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:226)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:107)
    at org.apache.spark.scheduler.Task.run(Task.scala:53)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:212)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:680)
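For what it's worth, "invalid type code: AC" from ObjectInputStream usually means the reader hit a second serialization stream header (0xAC 0xED) partway through the stream, i.e. more than one ObjectOutputStream got wrapped around the same underlying stream. A minimal sketch of that failure mode in plain java.io, nothing Spark-specific (StreamHeaderDemo is just an illustrative name):

import java.io._

object StreamHeaderDemo {
  def main(args: Array[String]): Unit = {
    val buf = new ByteArrayOutputStream()

    // Each ObjectOutputStream constructor writes its own 0xACED stream
    // header into buf.
    val out1 = new ObjectOutputStream(buf)
    out1.writeObject("first")
    out1.flush()
    val out2 = new ObjectOutputStream(buf) // second header, mid-stream
    out2.writeObject("second")
    out2.flush()

    // A single reader over the concatenated bytes reads the second
    // header's 0xAC byte as a type code and fails.
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    println(in.readObject()) // "first"
    println(in.readObject()) // StreamCorruptedException: invalid type code: AC
  }
}

No idea yet whether that's actually what's happening in the deserialization path here, but the error signature matches.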
Anybody else seen this yet?
I have a really simple PR, and this test fails even without my change, so I may go ahead and submit it anyway.
--
Evan Chan
Staff Engineer
[email protected]