I ran into the same problem with Hadoop's Configuration class
(org.apache.hadoop.conf.Configuration), a very complex, non-serializable class.
Basically, if your closure captures any instance (not a constant
object/singleton! those live in the jar, not in the closure) that does not
implement Serializable, or whose fields do not implement Serializable, you
are going to get this error.
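
For concreteness, here is a minimal sketch of the failure, assuming `sc` is
an existing SparkContext, `lines` is an RDD[String], and "some.key" is a
made-up config key:

    import org.apache.hadoop.conf.Configuration

    val hadoopConf = new Configuration()  // not java.io.Serializable
    val withConf = lines.map(line => hadoopConf.get("some.key") + line)
    withConf.count()
    // fails with something like "Task not serializable", caused by
    // java.io.NotSerializableException: org.apache.hadoop.conf.Configuration,
    // because the closure passed to map() captures hadoopConf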
My solution is to wrap the object in a SerializableWritable and use that in
your function closure. If it is going to be reused heavily, wrap it again
with sc.broadcast (see the sketch below). If you read the Spark source code,
you will find many instances of this pattern.
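
A rough sketch of that fix, under the same assumptions as above. Hadoop's
Configuration implements Writable, so SerializableWritable can wrap it; the
wrapper round-trips the conf through the Writable protocol instead of Java
serialization:

    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.SerializableWritable

    val hadoopConf = new Configuration()
    // broadcast once so every task reuses the same wrapped conf
    val confBroadcast = sc.broadcast(new SerializableWritable(hadoopConf))

    val withConf = lines.map { line =>
      // unwrap twice: first the broadcast, then the SerializableWritable
      val conf = confBroadcast.value.value
      conf.get("some.key") + line
    }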


