[ https://issues.apache.org/jira/browse/SPARK-24302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16482330#comment-16482330 ]
yijukang commented on SPARK-24302:
----------------------------------

[~hyukjin.kwon] OK, thanks.

> when using spark persist(), "KryoException: IndexOutOfBoundsException" happens
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-24302
>                 URL: https://issues.apache.org/jira/browse/SPARK-24302
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 1.6.0
>            Reporter: yijukang
>            Priority: Major
>              Labels: apache-spark
>
> My job uses Spark to insert RDD data into HBase like this:
> ------------------------------
> localData.persist()
> localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
> --------------------------------------
> This throws an exception:
>
> com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 99, Size: 6
> Serialization trace:
> familyMap (org.apache.hadoop.hbase.client.Put)
>         at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
>         at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
>         at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
>         at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
>         at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
>
> When I run it without persist():
> -----------------------------
> localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
> --------------------------------------
> it works well. What does the persist() method do?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
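For context on why persist() changes the outcome: persist() marks the RDD for caching, and when the cached blocks are serialized (for example under a `_SER` storage level, or when blocks spill to disk), Spark runs each element through the configured serializer. The serialization trace above shows Kryo's generic FieldSerializer walking `org.apache.hadoop.hbase.client.Put.familyMap`, which it does not handle reliably. Without persist(), the `Put` objects flow straight into `saveAsNewAPIHadoopDataset` and are never serialized by Kryo, so the error disappears. A common workaround is to persist the raw row data and build the `Put` objects lazily afterwards. The sketch below illustrates that idea only; `fetchRows`, `rowKeyOf`, and `toPut` are hypothetical placeholders, not part of the reported job:

```scala
import org.apache.spark.storage.StorageLevel

// Sketch, assuming a SparkContext `sc` and a configured `jobConf` as in the report.
// Persist the plain row values, NOT the HBase Put objects, so Kryo never has to
// serialize Put.familyMap even if the cached blocks are serialized or spilled.
val rawData = fetchRows(sc)                         // hypothetical: RDD of raw rows
rawData.persist(StorageLevel.MEMORY_AND_DISK)

// Build (rowkey, Put) pairs lazily, downstream of the persisted stage.
val localData = rawData.map(row => (rowKeyOf(row), toPut(row)))  // hypothetical helpers
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
```

An alternative, if the `Put` objects must be cached, is to register a serializer that handles them (e.g. via `spark.kryo.registrator`) rather than relying on Kryo's default FieldSerializer.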