You can add your custom jar to SPARK_CLASSPATH inside the spark-env.sh file
and restart the cluster to get it shipped to all the workers. Alternatively,
you can use the setJars option on SparkConf to add the jar when creating the
SparkContext.
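For example, in conf/spark-env.sh on each node (the jar path below is a
placeholder):

    export SPARK_CLASSPATH=$SPARK_CLASSPATH:/path/to/my-custom.jar

Or, a minimal Scala sketch using setJars (the app name, master URL, and jar
path are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("MyApp")
      .setMaster("spark://master:7077")
      .setJars(Seq("/path/to/my-custom.jar")) // shipped to the workers

    val sc = new SparkContext(conf)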

Thanks
Best Regards

On Tue, Nov 4, 2014 at 8:12 AM, Peng Cheng <pc...@uow.edu.au> wrote:

> Sorry, it's a timeout duplicate, please remove it
>
>
>
