Davies Liu created SPARK-18233:
----------------------------------

             Summary: Failed to deserialize the task
                 Key: SPARK-18233
                 URL: https://issues.apache.org/jira/browse/SPARK-18233
             Project: Spark
          Issue Type: Bug
            Reporter: Davies Liu


{code}

16/11/02 18:36:32 ERROR Executor: Exception in task 652.0 in stage 27.0 (TID 21101)
java.io.InvalidClassException: org.apache.spark.executor.TaskMet; serializable and externalizable flags conflict
        at java.io.ObjectStreamClass.readNonProxy(ObjectStreamClass.java:698)
        at java.io.ObjectInputStream.readClassDescriptor(ObjectInputStream.java:831)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1602)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
{code}
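
This exception is thrown by ObjectStreamClass.readNonProxy when the class descriptor it reads from the stream has both SC_SERIALIZABLE and SC_EXTERNALIZABLE set in its flags byte. A standard ObjectOutputStream never writes both flags for one class, so this usually points at corrupted or misaligned task bytes on the executor rather than at the class definition itself. Below is a minimal, Spark-independent sketch (the Payload class and the byte-flipping repro are illustrative assumptions, not the actual failure path in this job) that produces the same message by setting the extra flag bit on an otherwise valid stream:

{code}
import java.io._

// Hypothetical plain Serializable class: its class descriptor carries SC_SERIALIZABLE only.
class Payload(val x: Int) extends Serializable

object FlagsConflictRepro {
  def main(args: Array[String]): Unit = {
    // Serialize normally, as JavaSerializerInstance ultimately does via ObjectOutputStream.
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(new Payload(42))
    oos.close()
    val bytes = bos.toByteArray

    // In the TC_CLASSDESC record, the flags byte sits right after the UTF class name
    // and the 8-byte serialVersionUID.
    val name = classOf[Payload].getName.getBytes("UTF-8")
    val flagsIdx = bytes.indexOfSlice(name.toSeq) + name.length + 8

    // Corrupt the stream: set SC_EXTERNALIZABLE (0x04) on top of SC_SERIALIZABLE (0x02).
    bytes(flagsIdx) = (bytes(flagsIdx) | 0x04).toByte

    try {
      new ObjectInputStream(new ByteArrayInputStream(bytes)).readObject()
    } catch {
      case e: InvalidClassException =>
        // Prints: java.io.InvalidClassException: Payload; serializable and externalizable flags conflict
        println(e)
    }
  }
}
{code}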


