Is there any extra trick required to use jars on the SPARK_CLASSPATH when
running Spark on YARN?

I have several jars added to the SPARK_CLASSPATH in spark-env.sh. When my job
runs I print the SPARK_CLASSPATH, so I can see that the jars were added to the
environment the application master is running in. However, even though the jars
are on the classpath, I continue to get ClassNotFound errors.

I have also tried setting SPARK_CLASSPATH via SPARK_YARN_USER_ENV.
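For reference, this is roughly what I have in spark-env.sh (the jar paths here are placeholders for the actual jars I'm adding):

```shell
# spark-env.sh -- placeholder jar paths, substituted for my real dependencies
export SPARK_CLASSPATH="$SPARK_CLASSPATH:/opt/libs/dep-a.jar:/opt/libs/dep-b.jar"

# The SPARK_YARN_USER_ENV variant I also tried, to push the same
# classpath into the YARN containers' environment:
export SPARK_YARN_USER_ENV="SPARK_CLASSPATH=/opt/libs/dep-a.jar:/opt/libs/dep-b.jar"
```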
