Good to know, thanks for pointing this out to me!
On 23/04/2014 19:55, Sandy Ryza wrote:
Ah, you're right about SPARK_CLASSPATH and ADD_JARS. My bad.
SPARK_YARN_APP_JAR is going away entirely -
https://issues.apache.org/jira/browse/SPARK-1053
On Wed, Apr 23, 2014 at 8:07 AM, Christophe Préaud
christophe.pre...@kelkoo.com wrote:
Hi Sandy,
Thanks for your reply!
I thought
Hi Christophe,
Adding the jars to both SPARK_CLASSPATH and ADD_JARS is required. The
former makes them available to the spark-shell driver process, and the
latter tells Spark to make them available to the executor processes running
on the cluster.
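For reference, a minimal sketch of what this looks like when launching the shell (the jar path is a placeholder, not taken from this thread):

```shell
# Make an extra jar visible to the spark-shell driver process...
export SPARK_CLASSPATH=/path/to/extra-lib.jar
# ...and tell Spark to ship it to the executors on the cluster as well.
export ADD_JARS=/path/to/extra-lib.jar
# Then start the shell against YARN (Spark 0.9.x style):
./bin/spark-shell
```

Both variables must list the same jar(s): SPARK_CLASSPATH alone only covers the driver, and ADD_JARS alone only covers the executors.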
-Sandy
On Wed, Apr 16, 2014 at 9:27 AM,
Hi,
I am running Spark 0.9.1 on a YARN cluster, and I am wondering what the
correct way is to add external jars when running a spark-shell on a YARN cluster.
Packaging all these dependencies in an assembly whose path is then set in
SPARK_YARN_APP_JAR (as written in the doc: