You can add your custom jar to SPARK_CLASSPATH in the spark-env.sh file
and restart the cluster so it gets shipped to all the workers. Alternatively,
you can use the .setJars option to add the jar when creating the SparkContext.
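A minimal sketch of the .setJars route (the jar path, app name, and master URL below are placeholders, not taken from your setup); the spark-env.sh route is just an `export SPARK_CLASSPATH=$SPARK_CLASSPATH:/path/to/custom.jar` line on each node:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: the jar path, app name, and master URL are placeholders.
val conf = new SparkConf()
  .setAppName("my-app")
  .setMaster("spark://master:7077")
  // Ships the listed jars to every worker when tasks are launched.
  .setJars(Seq("/path/to/custom.jar"))
val sc = new SparkContext(conf)
```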
Thanks
Best Regards
On Tue, Nov 4, 2014 at 8:12 AM, Peng Cheng wrote:
I have a Spark application that serializes an object 'Seq[Page]', saves it to
HDFS/S3, and reads it back on another worker to be used elsewhere. The
serialization and deserialization use the same serializer as Spark itself
(obtained from SparkEnv.get.serializer.newInstance()).
However, I sporadically get this
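For reference, the pattern described above looks roughly like this sketch (Page stands in for the poster's own class and is a placeholder here; this must run where SparkEnv is initialized, i.e. on the driver or inside a task):

```scala
import java.nio.ByteBuffer
import org.apache.spark.SparkEnv

// Sketch only: Page is a placeholder for the user's own class.
case class Page(url: String, body: String)

def roundTrip(pages: Seq[Page]): Seq[Page] = {
  // Use the same serializer Spark itself is configured with.
  val ser = SparkEnv.get.serializer.newInstance()
  val bytes: ByteBuffer = ser.serialize(pages)
  // Fails (e.g. ClassNotFoundException) if Page's jar never reached this worker.
  ser.deserialize[Seq[Page]](bytes)
}
```

Deserialization like this only works if the class definition is on the worker's classpath, which is why shipping the jar (SPARK_CLASSPATH or setJars) matters.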
Sorry, it's a timeout duplicate; please remove it.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-make-sure-a-ClassPath-is-always-shipped-to-workers-tp18018p18020.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.