In which part of the code is the class not found? If it is in the SparkContext 
thread, say on the AM, then --addJars should work.

If it is in the tasks, then --addJars won't work; you need to use 
--file=local://xxx etc., though I am not sure whether that is available in 0.8.1. 
Bundling everything into a single jar should also work; if it doesn't, something 
might be wrong with the assembly.
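For the single-jar route, here is a rough sketch of an sbt-assembly setup (the 
plugin and library coordinates below are illustrative, from memory, so adjust 
them to your build):

// project/plugins.sbt -- pulls in the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.1")

// build.sbt
import AssemblyKeys._

assemblySettings

name := "application-with-dependencies"

scalaVersion := "2.9.3"

libraryDependencies ++= Seq(
  // Spark is marked "provided": the cluster already ships it via SPARK_JAR,
  // so only your own dependencies end up inside the fat jar.
  "org.apache.spark" %% "spark-core" % "0.8.1-incubating" % "provided"
)

Running "sbt assembly" then produces a single jar that you can pass to --jar, 
with no --addJars needed.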

Best Regards,
Raymond Liu

From: Eric Kimbrel [mailto:lekimb...@gmail.com] 
Sent: Wednesday, January 08, 2014 11:16 AM
To: user@spark.incubator.apache.org
Subject: Spark on Yarn classpath problems

I am trying to run Spark 0.8.1 on Hadoop 2.2.0-cdh5.0.0-beta-1 with YARN.

I am using the YARN Client in yarn-standalone mode, as described here: 
http://spark.incubator.apache.org/docs/latest/running-on-yarn.html

To simplify matters, I'll say my application code is all contained in 
application.jar and additionally depends on code in dependency.jar.

I launch my Spark application as follows:

SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./spark-class org.apache.spark.deploy.yarn.Client \
  --jar application.jar \
  --class <My main class> \
  --args <app specific arguments> \
  --num-workers <NUMBER_OF_WORKER_MACHINES> \
  --master-memory <MEMORY_FOR_MASTER> \
  --worker-memory <MEMORY_PER_WORKER> \
  --worker-cores <CORES_PER_WORKER> \
  --name <application_name> \
  --addJars dependency.jar


YARN loads the job and starts executing, but the job quickly dies with 
class-not-found exceptions for classes that are defined in dependency.jar.

As an attempted fix, I tried bundling all of the dependencies into a single jar, 
"application-with-dependencies.jar". I specified this jar with the --jar option 
and removed the --addJars line. Unfortunately this did not alleviate the issue, 
and the class-not-found exceptions continued.
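In case it helps, here is a rough sketch of how I check whether the classes are 
visible on the driver versus inside tasks (com.example.dep.SomeDependencyClass 
is a stand-in for one of the classes from dependency.jar, not the real name):

import org.apache.spark.SparkContext

object ClasspathProbe {
  // Placeholder for a class that lives in dependency.jar.
  val probeClass = "com.example.dep.SomeDependencyClass"

  def canLoad(name: String): String =
    try { Class.forName(name); "FOUND" }
    catch { case _: ClassNotFoundException => "MISSING" }

  def main(args: Array[String]) {
    val sc = new SparkContext("yarn-standalone", "classpath-probe")

    // Checked in the SparkContext thread, i.e. on the application master.
    println("driver: " + canLoad(probeClass))

    // Checked inside tasks, i.e. on the workers.
    val onWorkers = sc.parallelize(1 to 4, 4).map(_ => canLoad(probeClass)).collect()
    println("workers: " + onWorkers.mkString(", "))

    sc.stop()
  }
}

Launching this the same way as the real application shows whether the classes 
are missing on the AM, on the workers, or both.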

