I got an error:
     org.apache.spark.SparkException: Job aborted: Task not serializable:
java.io.NotSerializableException:
But the class it complains about is a Java library class that I depend on, so I
can't change it to implement Serializable.
Is there any way to work around this?

I am using Spark 0.9, spark master using local[2] mode.
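A common workaround, sketched below, is to keep the non-serializable object out of the serialized closure entirely: either construct it inside the task (e.g. in a `mapPartitions` body) or hold it in a wrapper behind a `@transient lazy val`, so the field is skipped during serialization and re-created on the worker on first use. The snippet uses plain JDK serialization to demonstrate the mechanism; `LibClient` and `ClientHolder` are hypothetical stand-ins for the library class and your wrapper, not Spark APIs.

```scala
import java.io._

// Stand-in for the third-party class you cannot modify (hypothetical).
// Note it does NOT extend Serializable.
class LibClient {
  def lookup(x: Int): Int = x * 2
}

// Serializable wrapper: the @transient lazy val field is skipped when the
// closure is serialized and rebuilt lazily after deserialization.
class ClientHolder extends Serializable {
  @transient lazy val client: LibClient = new LibClient
}

object Demo {
  // Serialize and deserialize, as Spark does when shipping a closure.
  def roundTrip[T](obj: T): T = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(obj)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    in.readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    // Round-tripping a raw LibClient would throw NotSerializableException;
    // the holder survives, and the client is re-created on first access.
    val holder = roundTrip(new ClientHolder)
    println(holder.client.lookup(21))
  }
}
```

In a Spark job you would reference `holder.client` inside the `map`/`filter` function instead of capturing the library object directly, so each executor builds its own instance.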




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-tp1973.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.