Yes, it's in yarn-standalone mode, and I did use the SparkContext.addJar method and
tried setting setExecutorEnv with SPARK_CLASSPATH, etc., but none of it worked.
I finally made it work by modifying the ClientBase.scala code, where I set
'appMasterOnly' to false before the addJars contents were added to the classpath.
Well, it says that the jar was successfully added, but I can't reference
classes from it. Does this have anything to do with this bug?
http://stackoverflow.com/questions/22457645/when-to-use-spark-classpath-or-sparkcontext-addjar
On Thu, Mar 27, 2014 at 2:57 PM, Sandy Ryza <sandy.r...@cloudera.com> wrote:
That bug only appears to apply to spark-shell.
Do things work in yarn-client mode or on a standalone cluster? Are you
passing a path with parent directories to addJar?
On Thu, Mar 27, 2014 at 3:01 PM, Sung Hwan Chung
<coded...@cs.stanford.edu> wrote:
Hello, (this is Yarn related)
I'm able to load an external jar and use its classes within the
ApplicationMaster. I want to use this jar on the worker nodes as well, so I
added sc.addJar(pathToJar) and ran the job.
I get the following exception:
org.apache.spark.SparkException: Job aborted: Task 0.0:1 failed 4 times
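For reference, the pattern described above is roughly the following (a sketch only; the jar path, class name, and job body are placeholders, not the poster's actual code, and it needs Spark on the classpath and a YARN cluster to run):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder setup; app name and master come from the YARN launcher here.
val conf = new SparkConf().setAppName("addJar-test")
val sc = new SparkContext(conf)

// Ship the external jar to the executors. The symptom in this thread is
// that on YARN the jar was effectively visible only to the ApplicationMaster.
sc.addJar("/path/to/external.jar")

// Any closure referencing classes from external.jar fails on the workers
// if the jar never reached their classpath.
sc.parallelize(1 to 10).map { i =>
  // e.g. new com.example.External().transform(i)  // placeholder class
  i
}.count()
```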
Hi Sung,
Are you using yarn-standalone mode? Have you specified the --addJars
option with your external jars?
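In yarn-standalone mode at the time, extra jars were passed at submission via the YARN Client rather than (or in addition to) SparkContext.addJar, roughly along these lines (a sketch; all paths and the main class are placeholders, so check the YARN docs for your Spark version):

```shell
# Sketch of a Spark-on-YARN (yarn-standalone) submission with --addJars.
# SPARK_JAR, the app jar, and the main class below are placeholders.
SPARK_JAR=./assembly/target/spark-assembly.jar \
  ./bin/spark-class org.apache.spark.deploy.yarn.Client \
    --jar my-app.jar \
    --class com.example.MyApp \
    --addJars /path/to/external.jar
```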
-Sandy
On Wed, Mar 26, 2014 at 1:17 PM, Sung Hwan Chung
<coded...@cs.stanford.edu> wrote: