I'm new to Spark and have run into issues using Kryo for serialization
instead of the default Java serializer. My SparkConf is configured as follows:

import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoSerializer

val conf = new SparkConf().setMaster("local").setAppName("test")
        .set("spark.kryo.registrationRequired", "false")
        .set("spark.serializer", classOf[KryoSerializer].getName)
        .set("spark.kryo.registrator", classOf[TestRegistrator].getName)

However, when I attempt to run my Spark job, I get the following exception
and stack trace:

java.io.InvalidClassException: ... no valid constructor
at ...
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
at ...

If I go into the REPL, use that same SparkConf to create a KryoSerializer
manually, and then run a serialize-deserialize round trip on my instance, I
get it back without an exception.
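
Roughly, the round trip I ran looks like this (instance is a value of the
placeholder type MyCaseClass from above):

import org.apache.spark.serializer.KryoSerializer

// Build a serializer instance directly from the same SparkConf.
val ser = new KryoSerializer(conf).newInstance()

// Serialize to a ByteBuffer, then deserialize it straight back.
val bytes = ser.serialize(instance)
val roundTripped = ser.deserialize[MyCaseClass](bytes)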

I'm perplexed as to why Spark isn't using Kryo when running my job. Any help
would be greatly appreciated.

Thanks,
Matthew



