Hi all, 

I have an uber jar built with Maven; its contents are:

my.org.my.classes.Class
...
lib/lib1.jar // 3rd party libs
lib/lib2.jar 

I use this kind of jar for Hadoop applications and everything works fine. 

I added the Spark libraries, Scala, and everything else Spark needs, but when I
submit this jar to Spark I get ClassNotFoundExceptions: 

spark-submit --class com.bla.TestJob --driver-memory 512m --master
yarn-client /home/ble/uberjar.jar

Then, while the job is running, I get this: 

java.lang.NoClassDefFoundError:
com/fasterxml/jackson/datatype/guava/GuavaModule

(Use of Jackson's GuavaModule is expected, since the job uses Jackson to read
JSON.)


This class is contained in lib/jackson-datatype-guava-2.4.3.jar, which is
inside the uber jar.

So I really don't know what I'm missing. I've tried using --jars and
SparkContext.addJar (adding the uber jar itself), with no luck. 

Is there any problem with uber jars that contain nested jars inside them? 
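My suspicion is that the default JVM classloader treats a nested jar as an
opaque zip entry, so the classes inside it never show up on the classpath.
A minimal sketch of the layout above (entry names are illustrative; Python's
zipfile is used only to inspect the archive, since a jar is just a zip):

```python
import io
import zipfile

# Build a toy "uber jar" that nests another jar under lib/,
# mimicking the layout described above (names are illustrative).
inner = io.BytesIO()
with zipfile.ZipFile(inner, "w") as z:
    z.writestr("com/fasterxml/jackson/datatype/guava/GuavaModule.class", b"...")

outer = io.BytesIO()
with zipfile.ZipFile(outer, "w") as z:
    z.writestr("my/org/my/classes/Class.class", b"...")
    z.writestr("lib/jackson-datatype-guava-2.4.3.jar", inner.getvalue())

# A classpath scan of the outer jar only sees top-level entries;
# the nested jar is a single opaque entry, and the .class files
# inside it are invisible at this level.
with zipfile.ZipFile(outer) as z:
    names = z.namelist()

print("my/org/my/classes/Class.class" in names)                           # True
print("lib/jackson-datatype-guava-2.4.3.jar" in names)                    # True
print("com/fasterxml/jackson/datatype/guava/GuavaModule.class" in names)  # False
```

If that is the cause, flattening the dependencies (e.g. with the
maven-shade-plugin, so the third-party .class files sit at the top level of the
uber jar instead of inside lib/*.jar) would presumably avoid it, though I may
be missing something Hadoop does differently with lib/ entries.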

Thanks!






--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/ClassNotFoundException-with-a-uber-jar-tp25493.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
