Hi Stephen,
Have you tried the --jars option (with jars separated by commas)? It
should make the given jars available both to the driver and the executors.
I believe one caveat currently is that if you give it a folder it won't
pick up all the jars inside.
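Because of that caveat, a directory has to be flattened into a comma-separated list yourself. A minimal sketch of one workaround, assuming your jars sit in a lib/ directory (the directory and jar names here are made up for illustration):

```shell
# --jars wants a comma-separated list; a bare directory is not expanded.
# Build the list with the shell, then hand it to spark-shell.
mkdir -p lib && touch lib/a.jar lib/b.jar   # stand-in jars for the example
JARS=$(echo lib/*.jar | tr ' ' ',')
echo "$JARS"
# spark-shell --jars "$JARS"
```

The last line is commented out since it needs a Spark installation; the point is only how the comma-separated list gets built.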
-Sandy
On Fri, Aug 15, 2014 at 4:07
Although this has been discussed a number of times here, I am still unclear
how to add user jars to the spark-shell:
a) for importing classes for use directly within the shell interpreter
b) for invoking SparkContext operations with closures that reference
user-supplied classes contained within jars
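For concreteness, the two use cases might look like the sketch below inside a shell started with --jars. The class com.example.Cleaner and the jar name myudfs.jar are hypothetical stand-ins, not real artifacts, and the snippet assumes a running spark-shell (so it is untested here):

```scala
// Started with: spark-shell --jars myudfs.jar
// (com.example.Cleaner and myudfs.jar are invented names for illustration)

// (a) import a class from the user jar for direct use in the interpreter
import com.example.Cleaner
val one = Cleaner.normalize("  Hello  ")

// (b) reference the same class inside a closure passed to an RDD operation;
// this works on the executors because --jars ships the jar to them as well
val rdd = sc.parallelize(Seq("a ", " b", " c "))
val cleaned = rdd.map(s => Cleaner.normalize(s)).collect()
```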