>> Does Spark automatically deploy the JAR for you on the DFS cache if it
>> is running in cluster mode? I haven't got that far yet to deploy my own
>> one-time JAR for testing. Just set up a local cluster for practice.
Date: Tue, 25 Mar 2014 23:13:58 +0100
Subject: Re: Using an external jar in the driver, in yarn-standalone mode.
From: julien.ca...@gmail.com
To: user@spark.apache.org
Thanks for your answer.
I am using
bin/spark-class org.apache.spark.deploy.yarn.Client --jar myjar.jar
--class myclass ...
myclass in myjar.jar contains a main that initializes a SparkContext in
yarn-standalone mode.
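For reference, a minimal sketch of such a driver main against the 0.9-era API (the object and app names here are hypothetical stand-ins for "myclass"):

```scala
import org.apache.spark.SparkContext

// Hypothetical driver entry point ("myclass" in the thread). The master
// string "yarn-standalone" tells a 0.9-era SparkContext that it is running
// inside the YARN ApplicationMaster launched by yarn.Client.
object MyClass {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("yarn-standalone", "my-app")
    // ... Spark jobs go here ...
    sc.stop()
  }
}
```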
Then I am using some code that uses myotherjar.jar, but I do not execute it
us
By 'use ... my main program' I presume you mean you have a main function in
a class file you want to use as your entry point.
SPARK_CLASSPATH, ADD_JAR, etc. add your jars on the master and the
workers... but not on the client.
For that, you're just using ordinary, everyday java/scala - so
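Since driver-side visibility is just ordinary JVM classpath behavior, a quick sketch of checking whether a jar's classes are actually visible on the driver (the class names below are hypothetical examples, not from the thread):

```scala
// Sketch: probe the driver's classpath with Class.forName. If the external
// jar (e.g. myotherjar.jar) was never put on the client-side classpath,
// its classes will not resolve here, regardless of SPARK_CLASSPATH/ADD_JAR.
object ClasspathCheck {
  def isOnClasspath(className: String): Boolean =
    try { Class.forName(className); true }
    catch { case _: ClassNotFoundException => false }
}
```

A class that ships with the JVM resolves, while a class from a jar missing on the client does not:

```scala
ClasspathCheck.isOnClasspath("java.lang.String")      // true
ClasspathCheck.isOnClasspath("com.example.NotThere")  // false
```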
Hi Julien,
Have you called SparkContext#addJars?
-Sandy
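A hedged sketch of that suggestion (the public per-jar method on SparkContext is `addJar`; the path below is hypothetical):

```scala
// Sketch: after creating the SparkContext in the driver main, ship the
// extra jar to the executors.
val sc = new SparkContext("yarn-standalone", "my-app")
sc.addJar("hdfs:///user/me/myotherjar.jar")
// Note: addJar makes the jar available to tasks on the workers; code that
// runs only in the driver still needs the jar on the driver's classpath.
```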
On Tue, Mar 25, 2014 at 10:05 AM, Julien Carme wrote:
> Hello,
>
> I have been struggling for ages to use an external jar in my spark driver
> program, in yarn-standalone mode. I just want to use it in my main program,
> outside the calls to