If I add the jars with the --addJars option, things appear to start
working. This is kind of ugly, though, because there are a lot of jars to
add (the HBase jars and their dependencies). It looks like adding the jars
to SPARK_CLASSPATH may successfully get them onto the workers, but it
doesn't add them to the application master / driver program. Is there a
way to add all the jars in a directory to the application master?
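
For reference, something like the following is roughly how I'm building the
--addJars list at the moment so I don't have to type every jar by hand (the
HBase lib path, assembly jar, and class name below are just placeholders for
my setup):

  # collect every jar under the HBase lib dir into a comma-separated list
  HBASE_JARS=$(echo /opt/hbase/lib/*.jar | tr ' ' ',')

  # hand the list to the yarn client along with the application jar
  SPARK_JAR=<path-to-spark-assembly-jar> ./spark-class \
    org.apache.spark.deploy.yarn.Client \
    --jar <path-to-app-jar> \
    --class <app-main-class> \
    --addJars "$HBASE_JARS"

It works, it's just a long list to carry around, which is why I'm hoping
there is a directory-level equivalent on the application master side.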




On Mon, Jan 13, 2014 at 1:57 PM, John Zhao <jz...@alpinenow.com> wrote:

> I have been facing the same problem. In my case it turns out you don’t
> need to set any classpath; the class-not-found exception was caused by the
> Hadoop version.
> I was trying to submit a Spark job to Hadoop 2.2.0 with YARN, so after I
> did the following it works fine:
>
> SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true ./sbt/sbt clean assembly
>
> Hope this helps.
>
> John.
>
>
> On Jan 13, 2014, at 1:29 PM, Eric Kimbrel <lekimb...@gmail.com> wrote:
>
> Is there any extra trick required to use jars on the SPARK_CLASSPATH when
> running Spark on YARN?
>
> I have several jars added to SPARK_CLASSPATH in spark-env.sh. When my job
> runs I print SPARK_CLASSPATH, so I can see that the jars were added to the
> environment the app master is running in. However, even though the jars
> are on the classpath, I continue to get class-not-found errors.
>
> I have also tried setting SPARK_CLASSPATH via SPARK_YARN_USER_ENV.
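>
> For concreteness, the spark-env.sh entry looks roughly like this (the
> HBase lib path is only a placeholder for where the jars really live):
>
>   export SPARK_CLASSPATH=$SPARK_CLASSPATH:/opt/hbase/lib/*
>
> and the SPARK_YARN_USER_ENV attempt was along the lines of:
>
>   export SPARK_YARN_USER_ENV="SPARK_CLASSPATH=/opt/hbase/lib/*"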
>
>
>
