Regardless of how we deploy our jar together with Spark, when running a Spark
Streaming job with Kryo as the serializer on top of Mesos, we sporadically get
the following error (truncated a bit):

16/11/18 08:39:10 ERROR OneForOneBlockFetcher: Failed while starting block fetches
java.lang.RuntimeException: org.apache.spark.SparkException: Failed to register classes with Kryo
  at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:129)
  at org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:274)
...
  at org.apache.spark.serializer.SerializerManager.dataSerializeStream(SerializerManager.scala:125)
  at org.apache.spark.storage.BlockManager$$anonfun$dropFromMemory$3.apply(BlockManager.scala:1265)
  at org.apache.spark.storage.BlockManager$$anonfun$dropFromMemory$3.apply(BlockManager.scala:1261)
...
Caused by: java.lang.ClassNotFoundException: cics.udr.compound_ran_udr
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

where "cics.udr.compound_ran_udr" is a class provided by us in a jar.

We know that the jar containing "cics.udr.compound_ran_udr" is being
deployed, because it is listed in the "Environment" tab in the GUI, and that
it works, because calculations using this class succeed.

We have tried the following methods of deploying the jar containing the
class:
 * Through --jars in spark-submit (sketched below)
 * Through SparkConf.setJars
 * Through spark.driver.extraClassPath and spark.executor.extraClassPath
 * By having it as the main jar used by spark-submit
with no luck. The logs (see attached) show that the jar is being added
to the classloader.
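
For concreteness, the spark-submit variant looks roughly like the sketch
below; the master URL, paths, jar name and main class are placeholders
rather than our actual values:

  spark-submit \
    --master mesos://zk://mesos-master:2181/mesos \
    --class com.example.StreamingJob \
    --jars /opt/jars/cics-udrs.jar \
    our-streaming-app.jar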

We have tried registering the class using
 * SparkConf.registerKryoClasses (sketched below)
 * spark.kryo.classesToRegister
with no luck.
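
For reference, a minimal sketch of the two registration approaches we tried
(the app name is a placeholder):

  import org.apache.spark.SparkConf

  // Programmatic registration on the SparkConf
  val conf = new SparkConf()
    .setAppName("streaming-job") // placeholder
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .registerKryoClasses(Array(classOf[cics.udr.compound_ran_udr]))

  // Declarative equivalent via configuration
  // (can also be passed with --conf on spark-submit)
  val conf2 = new SparkConf()
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .set("spark.kryo.classesToRegister", "cics.udr.compound_ran_udr")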

We are running on Mesos, and the jar has been deployed to the same location
on the local file system of every machine (see the configuration sketch
below).
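
One of the variants we tried points the driver and executors at that local
copy via extraClassPath; the path below is only an illustration, not our
real location:

  # spark-defaults.conf on every node (path is a placeholder)
  spark.driver.extraClassPath    /opt/spark-extra/cics-udrs.jar
  spark.executor.extraClassPath  /opt/spark-extra/cics-udrs.jar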

I would be very grateful for any help or ideas :)
