That bug only appears to apply to spark-shell.
Do things work in yarn-client mode or on a standalone cluster? Are you
passing a path with parent directories to addJar?
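(For concreteness, a minimal sketch of the two invocation styles in question; the master, app name, and paths are placeholders:)

    import org.apache.spark.SparkContext

    // Hypothetical setup for illustration only.
    val sc = new SparkContext("yarn-client", "addJarTest")
    sc.addJar("dep.jar")                // bare file name, resolved against the CWD
    sc.addJar("/full/path/to/dep.jar")  // path with parent directories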
On Thu, Mar 27, 2014 at 3:01 PM, Sung Hwan Chung wrote:
Well, it says that the jar was successfully added, but I can't reference
classes from it. Does this have anything to do with this bug?
http://stackoverflow.com/questions/22457645/when-to-use-spark-classpath-or-sparkcontext-addjar
On Thu, Mar 27, 2014 at 2:57 PM, Sandy Ryza wrote:
I just tried this in CDH (only a few patches ahead of 0.9.0) and was able
to include a dependency with --addJars successfully.
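(For reference, a hedged sketch of a yarn-standalone submission that passes --addJars, roughly following the 0.9-era docs; all jar names and paths are placeholders:)

    SPARK_JAR=/path/to/spark-assembly.jar \
      ./bin/spark-class org.apache.spark.deploy.yarn.Client \
        --jar /path/to/my-app.jar \
        --class com.example.MyApp \
        --args yarn-standalone \
        --num-workers 2 \
        --addJars hdfs:///user/me/my-dep.jar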
Can you share how you're invoking SparkContext.addJar? Anything
interesting in the application master logs?
-Sandy
On Thu, Mar 27, 2014 at 11:35 AM, Sung Hwan Chung wrote:
Yeah, it's in yarn-standalone mode, and I did use the SparkContext.addJar
method and also tried setting setExecutorEnv("SPARK_CLASSPATH", ...), etc.,
but none of it worked.
I finally made it work by modifying the ClientBase.scala code, where I set
'appMasterOnly' to false before the addJars contents were added to distCa
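(For context, a rough, non-authoritative sketch of the code path being described, based on the 0.9-era ClientBase.prepareLocalResources; exact names and layout may differ between releases:)

    // Simplified sketch: in ClientBase.prepareLocalResources, the --addJars
    // entries were registered with appMasterOnly = true, so the jars reached
    // only the application master. The workaround described above flips that
    // flag to false so the jars also land in the executors' distributed cache.
    val fileLists = List(
      (args.addJars,  LocalResourceType.FILE,    false), // originally `true` (AM only)
      (args.files,    LocalResourceType.FILE,    false),
      (args.archives, LocalResourceType.ARCHIVE, false))
    fileLists.foreach { case (flist, resType, appMasterOnly) =>
      // ... each entry ends up in distCacheMgr.addResource(..., appMasterOnly) ...
    }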
Hi Sung,
Are you using yarn-standalone mode? Have you specified the --addJars
option with your external jars?
-Sandy
On Wed, Mar 26, 2014 at 1:17 PM, Sung Hwan Chung wrote:
Hello, (this is Yarn related)
I'm able to load an external jar and use its classes within the
ApplicationMaster. I wish to use this jar on the worker nodes as well, so I
added sc.addJar(pathToJar) and ran the job.
I get the following exception:
org.apache.spark.SparkException: Job aborted: Task 0.0:1 failed 4 times
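(For reference, a minimal sketch of the driver-side setup described above; pathToJar and com.example.Dep are placeholders for the external jar and one of its classes:)

    import org.apache.spark.SparkContext

    // Hypothetical names for illustration.
    val pathToJar = "/path/to/external-dep.jar"
    val sc = new SparkContext("yarn-standalone", "externalJarTest")
    sc.addJar(pathToJar)

    // The closure references a class from the external jar; if the jar never
    // reaches the workers, the tasks fail and the job aborts as shown above.
    sc.parallelize(1 to 10).map(i => new com.example.Dep(i).toString).count()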