Hi,

the problems with %deps were varying error messages saying it must not be
executed after the %spark interpreter has started, and it also seemed to
work only the first time: if I ran it a second time I got weird errors
about mismatched content in JVM classes...

Anyway, what works for me is: using `--jars` for libraries that the
executors need, and doing only this:

z.addRepo("sonatype-snapshots",
  "https://oss.sonatype.org/content/repositories/snapshots", true)
z.load("com.example:spark-extensions_2.10:0.09-SNAPSHOT")

without %deps, for dependencies that are used in the driver/notebook ...

^^ this combination seems to work and doesn't cause any problems at all
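
For completeness, a sketch of the full notebook paragraph I mean (the repo name, coordinates, and versions are just the illustrative ones from above; the key point is that it has to run before anything touches the SparkContext):

```scala
// Zeppelin %spark paragraph -- run this BEFORE any paragraph that uses sc,
// otherwise the loaded jars won't end up on the driver classpath.
// Repo URL and artifact coordinates are placeholders from the example above.
z.addRepo("sonatype-snapshots",
  "https://oss.sonatype.org/content/repositories/snapshots", true)  // true = allow snapshots
z.load("com.example:spark-extensions_2.10:0.09-SNAPSHOT")

// In later paragraphs the loaded classes can then be imported, e.g.:
// import com.example.spark.extensions._
```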

Btw, for using hadoop-aws the best thing one can do is:

--jars=file:/some/path/aws-java-sdk-1.7.14.jar,file:/some/path/hadoop-aws-2.6.0.jar

Otherwise one is always fighting clashes between transitive
dependencies.
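
Concretely, in conf/zeppelin-env.sh that looks like the following. SPARK_SUBMIT_OPTIONS is how Zeppelin forwards flags to spark-submit; the jar paths are the placeholder ones from above, so adjust them to your install:

```shell
# zeppelin-env.sh -- hand the exact AWS jars to spark-submit, bypassing
# Maven resolution and its transitive-dependency clashes.
# The /some/path locations are illustrative placeholders.
export SPARK_SUBMIT_OPTIONS="--jars=file:/some/path/aws-java-sdk-1.7.14.jar,file:/some/path/hadoop-aws-2.6.0.jar"
```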



--
View this message in context: 
http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Cannot-use-arbitrary-external-jar-files-in-spark-submit-through-Zeppelin-should-be-fixed-tp1277p1797.html
Sent from the Apache Zeppelin Users (incubating) mailing list
archive at Nabble.com.
