Hi - was this the JIRA issue? https://issues.apache.org/jira/browse/SPARK-2988
Any help in getting this working would be much appreciated!

Thanks,
Alex

On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma <scrapco...@gmail.com> wrote:

> You are right, this needs to be done. I can work on it soon; I was not sure
> if anyone is even using the Scala 2.11 Spark REPL. There is actually a
> patch in the Scala 2.10 shell to support adding jars (I've lost the JIRA ID),
> which has to be ported to Scala 2.11 as well. If you (or anyone else) are
> planning to work on it, I can help.
>
> Prashant Sharma
>
> On Thu, Apr 9, 2015 at 3:08 PM, anakos <ana...@gmail.com> wrote:
>
>> Hi-
>>
>> I am having difficulty getting the 1.3.0 Spark shell to find an external
>> jar. I have built Spark locally for Scala 2.11 and I am starting the REPL
>> as follows:
>>
>> bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar
>>
>> I see the following line in the console output:
>>
>> 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR
>> file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar
>> at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar
>> with timestamp 1428569535904
>>
>> but when I try to import anything from this jar, it is simply not
>> available. When I try to add the jar manually with the command
>>
>> :cp /path/to/jar
>>
>> the classes in the jar are still unavailable. I understand that 2.11 is
>> not officially supported, but has anyone been able to get an external jar
>> loaded in the 1.3.0 release? Is this a known issue? I have tried searching
>> for answers, but the only thing I've found that may be related is this:
>>
>> https://issues.apache.org/jira/browse/SPARK-3257
>>
>> Any/all help is much appreciated.
>> Thanks,
>> Alex
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/External-JARs-not-loading-Spark-Shell-Scala-2-11-tp22434.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
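[Editor's note for readers hitting the same symptom: a workaround often suggested for Spark 1.x REPL classpath problems is to put the jar on the driver's own classpath in addition to distributing it with `--jars`. This is a sketch of that suggestion using the jar name from the thread, not a confirmed fix for the Scala 2.11 build:]

```shell
# Sketch of a common workaround: pass the jar both via --jars (ships it
# to executors) and via --driver-class-path (makes it visible to the
# driver JVM that hosts the REPL). Jar name taken from the thread above.
bin/spark-shell --master yarn \
  --jars data-api-es-data-export-4.0.0.jar \
  --driver-class-path data-api-es-data-export-4.0.0.jar
```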
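[Editor's note: to tell a jar-distribution problem apart from a REPL classloader problem, it can help to check from inside the shell whether a class from the jar is visible at all, without importing it. The helper below is a self-contained sketch; the example class name in the comment is hypothetical, so substitute a real class from the jar:]

```scala
// Returns true if the named class is visible to the current classloader.
// The `initialize = false` argument avoids running static initializers,
// so this is a pure visibility check.
def classVisible(fqcn: String): Boolean =
  try {
    Class.forName(fqcn, false, getClass.getClassLoader)
    true
  } catch {
    case _: ClassNotFoundException => false
  }

// Usage inside the REPL (hypothetical class name from the export jar):
//   classVisible("com.example.dataapi.EsDataExport")
```

If this returns false for a class you know is in the jar, the jar was shipped but never reached the REPL's classloader, which matches the `:cp` symptom described above.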