I have been facing the same problem. In my case, it turns out you don't need 
to set any classpath; the ClassNotFoundException is caused by a Hadoop 
version mismatch.
I was trying to submit a Spark job to Hadoop 2.2.0 with YARN. After I rebuilt 
Spark as follows, it works fine: 
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true ./sbt/sbt clean assembly
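If you want to double-check the rebuild, the assembly jar name encodes the 
Hadoop version it was built against (the path below is just a sketch; the 
Scala and Spark versions in it depend on your checkout):

  # Look for a jar named like spark-assembly-<spark-version>-hadoop2.2.0.jar
  ls assembly/target/scala-*/spark-assembly-*.jar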
Hope this can help you.

John.


On Jan 13, 2014, at 1:29 PM, Eric Kimbrel <lekimb...@gmail.com> wrote:

> Is there any extra trick required to use jars on the SPARK_CLASSPATH when 
> running Spark on YARN?
> 
> I have several jars added to the SPARK_CLASSPATH in spark-env.sh. When my 
> job runs I print the SPARK_CLASSPATH, so I can see that the jars were added to 
> the environment the application master is running in. However, even though the 
> jars are on the classpath, I continue to get ClassNotFoundException errors.
> 
> I have also tried setting SPARK_CLASSPATH via SPARK_YARN_USER_ENV.
