For the second exception, was there anything following the SparkException which would give us more of a clue?
Can you tell us how EsDoc is structured? Thanks

On Fri, Jul 31, 2015 at 8:42 AM, Wang, Ningjun (LNG-NPV) <ningjun.w...@lexisnexis.com> wrote:

> Does anybody have any idea how to solve this problem?
>
> Ningjun
>
> *From:* Wang, Ningjun (LNG-NPV)
> *Sent:* Thursday, July 30, 2015 11:06 AM
> *To:* user@spark.apache.org
> *Subject:* How to register array class with Kyro in spark-defaults.conf
>
> I registered my class with Kryo in spark-defaults.conf as follows:
>
>     spark.serializer                 org.apache.spark.serializer.KryoSerializer
>     spark.kryo.registrationRequired  true
>     spark.kryo.classesToRegister     ltn.analytics.es.EsDoc
>
> But I got the following exception:
>
>     java.lang.IllegalArgumentException: Class is not registered: ltn.analytics.es.EsDoc[]
>     Note: To register this class use: kryo.register(ltn.analytics.es.EsDoc[].class);
>         at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:442)
>         at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:79)
>         at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:472)
>         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:565)
>         at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:162)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
>
> The error message seems to suggest that I should also register the array class EsDoc[].
> So I added it to spark-defaults.conf as follows:
>
>     spark.kryo.classesToRegister     ltn.analytics.es.EsDoc,ltn.analytics.es.EsDoc[]
>
> Then I got the following error:
>
>     org.apache.spark.SparkException: Failed to register classes with Kryo
>         at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:101)
>         at org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:153)
>         at org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:115)
>         at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:200)
>         at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:101)
>         at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:84)
>         at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>         at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
>         at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
>         at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1051)
>         at ltn.analytics.index.Index.addDocuments(Index.scala:82)
>
> Please advise.
>
> Thanks.
>
> Ningjun
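One note that may explain the second failure: on the JVM, the array class `EsDoc[]` is not named `ltn.analytics.es.EsDoc[]` but has the binary name `[Lltn.analytics.es.EsDoc;`, and `Class.forName` only accepts the binary form. A minimal sketch (using `String` as a stand-in for `EsDoc`, which I don't have here):

```scala
// Minimal sketch: the JVM binary name of an array class is "[L<fqcn>;",
// which is the only form Class.forName will resolve.
object ArrayClassNameDemo {
  def main(args: Array[String]): Unit = {
    val arrayClass = classOf[Array[String]]
    // Prints the binary name of String[]: "[Ljava.lang.String;"
    println(arrayClass.getName)
    // Class.forName round-trips the binary name back to the same class
    assert(Class.forName("[Ljava.lang.String;") eq arrayClass)
  }
}
```

So if `spark.kryo.classesToRegister` entries are resolved with `Class.forName`, the `EsDoc[]` spelling would fail to load, which could be what the `Failed to register classes with Kryo` wraps.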
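If the conf string can't express the array class, one workaround is a custom `KryoRegistrator` set via `spark.kryo.registrator`, where you can register the array class directly in code. A rough sketch, untested against your setup (`EsDocRegistrator` is a name I made up; it assumes `EsDoc` is on the executor classpath):

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator
import ltn.analytics.es.EsDoc

// Hypothetical registrator: registers both EsDoc and the EsDoc[] array class.
class EsDocRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[EsDoc])
    kryo.register(classOf[Array[EsDoc]]) // this is what the error calls EsDoc[]
  }
}
```

Then in spark-defaults.conf:

    spark.kryo.registrator    ltn.analytics.es.EsDocRegistrator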