Ido, when you say add external JARs, do you mean via --addJars, i.e. adding some 
jars for the SparkContext to use in the AM env?

If so, I don't think you need it for yarn-client mode at all. In yarn-client 
mode the SparkContext runs locally, so I think you just need to make sure those 
jars are on the driver's java classpath.

And for the jars needed by executors / tasks, I think you can package them as 
Matei said. Or maybe we can expose some env variable for yarn-client mode to 
allow adding multiple jars as needed.
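
Something like this is what I have in mind (just a rough sketch against the 
0.8.x Java API; the jar paths and app name below are placeholders):

import org.apache.spark.api.java.JavaSparkContext;

public class YarnClientSketch {
  public static void main(String[] args) {
    // Placeholder dependency jars. In yarn-client mode the SparkContext runs
    // in this JVM, so these jars also need to be on the driver's own java
    // classpath when the application is launched.
    String[] depJars = new String[] {
        "/path/to/dep1.jar",
        "/path/to/dep2.jar"
    };

    // Passing the jars to the constructor is meant to ship them to the
    // executors / tasks on the YARN workers.
    JavaSparkContext sc = new JavaSparkContext(
        "yarn-client",                   // master
        "MyApp",                         // placeholder app name
        System.getenv("SPARK_HOME"),
        depJars);

    // ... run jobs with sc ...
    sc.stop();
  }
}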

Best Regards,
Raymond Liu

From: Matei Zaharia [mailto:matei.zaha...@gmail.com]
Sent: Tuesday, December 24, 2013 1:17 PM
To: user@spark.incubator.apache.org
Subject: Re: Unable to load additional JARs in yarn-client mode

I'm surprised by this, but one way that will definitely work is to assemble 
your application into a single JAR. If passing them to the constructor doesn't 
work, that's probably a bug.
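
For example, something like this (a rough sketch; the assembled JAR path and 
app name are placeholders):

import org.apache.spark.api.java.JavaSparkContext;

public class SingleJarSketch {
  public static void main(String[] args) {
    // One assembled JAR containing the application plus all of its
    // dependencies (placeholder path); passing it to the constructor
    // ships it to the executors.
    JavaSparkContext sc = new JavaSparkContext(
        "yarn-client",
        "MyApp",                                         // placeholder name
        System.getenv("SPARK_HOME"),
        new String[] { "/path/to/myapp-assembly.jar" });

    // ... run jobs with sc ...
    sc.stop();
  }
}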

Matei

On Dec 23, 2013, at 12:03 PM, Karavany, Ido <ido.karav...@intel.com> wrote:


Hi All,

For our application we need to use the yarn-client mode featured in 0.8.1 
(YARN 2.0.5).
We've successfully executed it in both yarn-client and yarn-standalone modes 
with our Java applications.

While in yarn-standalone mode there is a way to add external JARs, we couldn't 
find a way to add them in yarn-client mode.

Adding jars via the SparkContext constructor or setting SPARK_CLASSPATH didn't 
work either.

Are we missing something?
Can you please advise?
If it is currently impossible, can you advise a patch / workaround?

It is crucial for us to get it working with external dependencies.

Many Thanks,
Ido



---------------------------------------------------------------------
Intel Electronics Ltd.

This e-mail and any attachments may contain confidential material for
the sole use of the intended recipient(s). Any review or distribution
by others is strictly prohibited. If you are not the intended
recipient, please contact the sender and delete all copies.
