Once you submit the application, you can check the Environment tab of the driver UI (running on port 4040) to see whether the jars you added were shipped. If they were shipped and you are still getting NoClassDefFoundError exceptions, then you have a jar conflict, which you can resolve by putting the jar containing the class at the front of your classpath.
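One way to put the user-supplied jars ahead of Spark's own classpath is the (experimental) userClassPathFirst settings. A minimal sketch, reusing the hypothetical paths from the question below:

```shell
# Sketch: make user jars win classpath conflicts against Spark's bundled
# dependencies (experimental configuration flags).
spark-submit \
  --class myClass \
  --master spark://localhost:7077 \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars hdfs://localhost/1.jar,hdfs://localhost/2.jar \
  hdfs://localhost/main.jar
```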
Thanks
Best Regards

On Tue, Jun 9, 2015 at 9:05 AM, Dong Lei <dong...@microsoft.com> wrote:

> Hi, spark-users:
>
> I'm using spark-submit to submit multiple jars and files (all in HDFS) to
> run a job, with the following command:
>
> spark-submit
> --class myClass
> --master spark://localhost:7077/
> --deploy-mode cluster
> --jars hdfs://localhost/1.jar, hdfs://localhost/2.jar
> --files hdfs://localhost/1.txt, hdfs://localhost/2.txt
> hdfs://localhost/main.jar
>
> The stderr in the driver showed java.lang.ClassNotDefException for a class
> in 1.jar.
>
> I checked the log and Spark has added these jars:
>
> INFO SparkContext: Added JAR hdfs:// …1.jar
> INFO SparkContext: Added JAR hdfs:// …2.jar
>
> In the folder of the driver, I only saw main.jar copied to that place,
> *but the other jars and files were not there*.
>
> Could someone explain *how I should pass the jars and files* needed by
> the main jar to Spark?
>
> If my class in main.jar refers to these files with a relative path, *will
> Spark copy these files into one folder*?
>
> BTW, my class works in client mode with all jars and files local.
>
> Thanks
> Dong Lei
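One likely culprit in the command quoted above is the space after the comma in --jars and --files: the shell splits arguments on whitespace, so everything after the space is no longer part of the --jars (or --files) list and spark-submit can mistake it for a positional argument. A corrected sketch, keeping the same hypothetical paths:

```shell
# Sketch: --jars and --files take comma-separated lists with NO spaces,
# otherwise the shell breaks the list into separate arguments.
spark-submit \
  --class myClass \
  --master spark://localhost:7077 \
  --deploy-mode cluster \
  --jars hdfs://localhost/1.jar,hdfs://localhost/2.jar \
  --files hdfs://localhost/1.txt,hdfs://localhost/2.txt \
  hdfs://localhost/main.jar
```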