Hi-

I am having difficulty getting the 1.3.0 Spark shell to find an external
jar.  I have built Spark locally for Scala 2.11 and I am starting the REPL
as follows:

bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar

I see the following line in the console output:

15/04/09 09:52:15 INFO spark.SparkContext: Added JAR
file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar with
timestamp 1428569535904
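As a sanity check from inside the REPL, something like the following can confirm whether the driver's classloader actually sees the jar ("com.example.EsExport" is a placeholder class name; substitute a real class from data-api-es-data-export-4.0.0.jar):

```scala
// Can the driver's classloader resolve a class that should come
// from the jar passed via --jars?
def classVisible(name: String): Boolean =
  try { Class.forName(name); true }
  catch { case _: ClassNotFoundException => false }

println(classVisible("java.lang.String"))     // known-good baseline: true
println(classVisible("com.example.EsExport")) // false if the jar is not on the driver classpath
```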

but when I try to import anything from this jar, it's simply not available.
When I try to add the jar manually using the command

:cp /path/to/jar

the classes in the jar are still unavailable. I understand that Scala 2.11
is not officially supported, but has anyone been able to get an external
jar loaded in the 1.3.0 release? Is this a known issue? I have searched
around for answers, but the only thing I've found that may be related is
this:

https://issues.apache.org/jira/browse/SPARK-3257
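For what it's worth, the only workaround I can think of trying (untested on the 2.11 build, so this is an assumption rather than a confirmed fix) is to also put the jar on the driver classpath explicitly, in addition to shipping it with --jars:

```shell
# Untested workaround sketch: add the jar to the driver classpath
# directly as well as distributing it to executors via --jars.
bin/spark-shell --master yarn \
  --jars data-api-es-data-export-4.0.0.jar \
  --driver-class-path data-api-es-data-export-4.0.0.jar
```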

Any/all help is much appreciated.
Thanks
Alex



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

