Earthson Lu created SPARK-6465:
----------------------------------

             Summary: GenericRowWithSchema: KryoException: Class cannot be created (missing no-arg constructor):
                 Key: SPARK-6465
                 URL: https://issues.apache.org/jira/browse/SPARK-6465
             Project: Spark
          Issue Type: Bug
          Components: DataFrame
    Affects Versions: 1.3.0
         Environment: Spark 1.3, YARN 2.6.0, CentOS
            Reporter: Earthson Lu


I could not find an existing issue for this.

The Kryo registration for GenericRowWithSchema is missing from org.apache.spark.sql.execution.SparkSqlSerializer.

Is registering it there the only thing we need to do?
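
For what it's worth, here is a minimal user-side workaround sketch, without assuming anything about the eventual fix in SparkSqlSerializer: a custom KryoRegistrator (the class name RowKryoRegistrator is made up for this example) that registers the Row classes and lets Objenesis instantiate them despite the missing no-arg constructor.
```scala
// Workaround sketch only, not the proposed patch. Class names are from Spark 1.3.0.
import com.esotericsoftware.kryo.Kryo
import org.objenesis.strategy.StdInstantiatorStrategy
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.serializer.KryoRegistrator
import org.apache.spark.sql.catalyst.expressions.{GenericRow, GenericRowWithSchema}
import org.apache.spark.sql.types.StructType

class RowKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    // Objenesis can construct objects that lack a no-arg constructor,
    // which is exactly what GenericRowWithSchema is missing.
    kryo.setInstantiatorStrategy(new StdInstantiatorStrategy)
    kryo.register(classOf[GenericRow])
    kryo.register(classOf[GenericRowWithSchema])
    kryo.register(classOf[StructType]) // the schema field travels with the row
  }
}

object Workaround {
  def main(args: Array[String]): Unit = {
    // Point Spark at the registrator; it must be on the executor classpath.
    val conf = new SparkConf()
      .setAppName("row-kryo-workaround")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "RowKryoRegistrator")
    val sc = new SparkContext(conf)
    // ... build SQLContext / DataFrames as usual ...
    sc.stop()
  }
}
```
This only papers over the problem on the application side; the real question above still stands.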

Here is the log
```
15/03/23 16:21:00 WARN TaskSetManager: Lost task 9.0 in stage 20.0 (TID 31978, datanode06.site): com.esotericsoftware.kryo.KryoException: Class cannot be created (missing no-arg constructor): org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema
        at com.esotericsoftware.kryo.Kryo.newInstantiator(Kryo.java:1050)
        at com.esotericsoftware.kryo.Kryo.newInstance(Kryo.java:1062)
        at com.esotericsoftware.kryo.serializers.FieldSerializer.create(FieldSerializer.java:228)
        at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:217)
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
        at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
        at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
        at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:138)
        at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:133)
        at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
        at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
        at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
        at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
        at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        at org.apache.spark.sql.execution.joins.HashJoin$$anon$1.hasNext(HashJoin.scala:66)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
        at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:217)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:64)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
        at java.lang.Thread.run(Thread.java:722)
```
