Hi,

is it possible to add jars to the Spark executor/driver classpath using a
path that is relative to the Spark home?
I need to set the following properties in the Spark conf:
- spark.driver.extraClassPath
- spark.executor.extraClassPath

The reason I need relative paths is that, otherwise, every node in the
Spark cluster would have to keep the jars at the same absolute path.

I know we can pass the jars using the --jars option, but I'd prefer to use
the classpath properties instead.
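For illustration, this is roughly what I'm hoping would work in
conf/spark-defaults.conf (the jar path here is just an example, and I'm
assuming the relative path would be resolved against SPARK_HOME on each
node, which may not be the case):

```
# conf/spark-defaults.conf -- illustrative sketch, not a working config
# Hoping "extras/my-lib.jar" resolves relative to SPARK_HOME on every node:
spark.driver.extraClassPath     extras/my-lib.jar
spark.executor.extraClassPath   extras/my-lib.jar
```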

cheers
-- 
Niranda
@n1r44 <https://twitter.com/N1R44>
https://pythagoreanscript.wordpress.com/
