Hi Guys, 

I hit an exception while running an application on a 1.2.0-SNAPSHOT build.
The executor log shows:

2014-12-23 07:45:36,333 | ERROR | [Executor task launch worker-0] | Exception in task 0.0 in stage 0.0 (TID 0) | org.apache.spark.Logging$class.logError(Logging.scala:96)
java.io.StreamCorruptedException: invalid stream header: 00546864
        at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:804)
        at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:57)
        at org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:57)
        at org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:99)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:86)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:163)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2014-12-23 07:45:36,357 | INFO  | [sparkExecutor-akka.actor.default-dispatcher-3] | Got assigned task 1 | org.apache.spark.Logging$class.logInfo(Logging.scala:59)
2014-12-23 07:45:36,358 | INFO  | [Executor task launch worker-0] | Running task 1.0 in stage 0.0 (TID 1) | org.apache.spark.Logging$class.logInfo(Logging.scala:59)
2014-12-23 07:45:36,414 | ERROR | [Executor task launch worker-0] | Exception in task 1.0 in stage 0.0 (TID 1) | org.apache.spark.Logging$class.logError(Logging.scala:96)
java.io.StreamCorruptedException: invalid stream header: 00546864
        ... (identical stack trace to TID 0 above)
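
For reference, a valid Java serialization stream always begins with the magic bytes 0xACED followed by version 0x0005, while the four bytes in the message decode to a NUL byte plus printable ASCII. Here is a minimal sketch decoding them, plain Scala with no Spark dependency (the DecodeHeader name is just for illustration):

import java.io.ObjectStreamConstants

object DecodeHeader extends App {
  // The four header bytes from "invalid stream header: 00546864".
  val bad = Array[Byte](0x00, 0x54, 0x68, 0x64)

  // A valid java.io.ObjectInputStream begins with STREAM_MAGIC (0xACED)
  // followed by STREAM_VERSION (0x0005).
  val magic   = ObjectStreamConstants.STREAM_MAGIC & 0xFFFF    // 0xACED
  val version = ObjectStreamConstants.STREAM_VERSION & 0xFFFF  // 0x0005
  println(f"expected header: 0x$magic%04X$version%04X")        // 0xACED0005

  // The observed bytes are a NUL byte followed by printable ASCII "Thd".
  val observed = bad.map(b =>
    if (b >= 0x20 && b < 0x7f) b.toChar.toString else f"\\x$b%02x").mkString
  println(s"observed header: $observed")                       // \x00Thd
}

So whatever bytes the executor received, they were not the output of an ObjectOutputStream.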

I know this happens while the executor deserializes the task. But after
checking the Spark code, I found that what gets serialized for a task is
quite simple: the task's files and jars, plus a Task object containing the
stageId and partitionId.
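
Since the stack trace shows JavaSerializerInstance.deserialize is a thin wrapper over java.io.ObjectInputStream, the exact same message can be reproduced outside Spark by handing ObjectInputStream bytes that were never written by an ObjectOutputStream. A minimal sketch of the failure mode (not the root cause); CorruptHeaderDemo is just an illustrative name:

import java.io._

object CorruptHeaderDemo extends App {
  // Serialize a small object the normal way.
  val buf = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buf)
  out.writeObject("some task payload")
  out.close()
  val good = buf.toByteArray            // begins with AC ED 00 05

  // Overwrite the 4-byte header with the bytes seen in the executor log,
  // mimicking a task blob that was never Java-serialized to begin with.
  val bad = good.clone()
  Array[Byte](0x00, 0x54, 0x68, 0x64).copyToArray(bad)

  try {
    // ObjectInputStream validates the header in its constructor, which is
    // exactly where the executor stack trace blows up (<init> at line 299).
    new ObjectInputStream(new ByteArrayInputStream(bad)).readObject()
  } catch {
    case e: StreamCorruptedException =>
      println(e)  // java.io.StreamCorruptedException: invalid stream header: 00546864
  }
}

Of course this only reproduces the symptom; it doesn't explain how the executor ends up with non-serialized bytes in the first place.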

I can't confirm what causes this issue, and it is hard to reproduce.

But I don't think the application code makes a difference, since this code
path is transparent to users.

Does anyone have any ideas? Thanks for the help.

P.S. This error occurred in every executor of this application.


