I actually just saw your comment on SPARK-6989 before this message, so I'll 
copy my reply to the mailing list:

I'm not sure I understand what you mean about running on 2.11.6. I'm just 
running the spark-shell command, which in turn runs:


  java -cp /opt/spark/conf:/opt/spark/lib/spark-assembly-1.3.2-SNAPSHOT-hadoop2.5.0-cdh5.3.3.jar:/etc/hadoop/conf:/opt/spark/lib/jline-2.12.jar \
    -Dscala.usejavacp=true -Xms512m -Xmx512m \
    org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main spark-shell


I built Spark with the included build/mvn script. As far as I can tell, the 
only reference to a specific version of Scala is in the top-level pom file, and 
it says 2.11.2.
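
For reference, this is roughly what grepping for it looks like in my tree
(the first hit would be the 2.10 default, the second the scala-2.11
profile's override; exact values may differ in yours):

  $ grep '<scala.version>' pom.xml
      <scala.version>2.10.4</scala.version>
      <scala.version>2.11.2</scala.version>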

> On Apr 17, 2015, at 9:57 PM, Sean Owen <so...@cloudera.com> wrote:
> 
> You are running on 2.11.6, right? Of course, it seems like that should
> all work, but it doesn't for you. My point is that the shell you say
> doesn't work is Scala's 2.11.2 shell, with some light modification.
> 
> It's possible that the delta is the problem. I can't entirely make out
> whether the errors are Spark-specific; they involve Spark classes in
> some cases but they're assertion errors from Scala libraries.
> 
> I don't know if this shell is supposed to work even across maintenance
> releases as-is, though that would be very nice. The REPL internals
> aren't a stable "API" that Scala guarantees.
> 
> A good test of whether this idea has any merit would be to run with
> Scala 2.11.2. I'll copy this to JIRA for continuation.
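> 
> For instance, checking from inside the shell would settle which Scala
> library is actually on the classpath (hypothetical session; your output
> will vary):
> 
>   scala> scala.util.Properties.versionString
>   res0: String = version 2.11.2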
> 
> On Fri, Apr 17, 2015 at 10:31 PM, Michael Allman <mich...@videoamp.com> wrote:
>> Hmmmm... I don't follow. The 2.11.x series is supposed to be binary
>> compatible for user code. Anyway, I was building Spark against 2.11.2
>> and still saw the problems with the REPL. I've created a bug report:
>> 
>> https://issues.apache.org/jira/browse/SPARK-6989
>> 
>> I hope this helps.
>> 
>> Cheers,
>> 
>> Michael
>> 
>> On Apr 17, 2015, at 1:41 AM, Sean Owen <so...@cloudera.com> wrote:
>> 
>> Doesn't this reduce to "Scala isn't compatible with itself across
>> maintenance releases"? Meaning, if this were "fixed" then Scala
>> 2.11.{x < 6} would have similar failures. It's not not-ready; it's
>> just not the Scala 2.11.6 REPL. Still, sure, I'd favor breaking the
>> unofficial support for older 2.11.x releases to at least make the
>> latest Scala 2.11 the unbroken one.
>> 
>> On Fri, Apr 17, 2015 at 7:58 AM, Michael Allman <mich...@videoamp.com>
>> wrote:
>> 
>> FWIW, this is an essential feature for our use of Spark, and I'm surprised
>> it's not advertised clearly as a limitation in the documentation. All I've
>> found about running Spark 1.3 on 2.11 is here:
>> 
>> http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211
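>> 
>> Per that page, the 2.11 build amounts to roughly this (quoting from
>> memory, so double-check the exact profiles for your Hadoop version):
>> 
>>   dev/change-version-to-2.11.sh
>>   mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package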
>> 
>> Also, I'm experiencing some serious stability problems simply trying to run
>> the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a
>> torrent of compiler assertion failures, etc. See attached.
>> 
>> 
>> Unfortunately, it appears the Spark 1.3 Scala 2.11 REPL is simply not ready
>> for production use. I was going to file a bug, but it seems clear that the
>> current implementation is going to need to be forward-ported to Scala 2.11.6
>> anyway. We already have an issue for that:
>> 
>> https://issues.apache.org/jira/browse/SPARK-6155
>> 
>> Michael
>> 
>> On Apr 9, 2015, at 10:29 PM, Prashant Sharma <scrapco...@gmail.com> wrote:
>> 
>> You will have to go to commit 191d7cf2a655d032f160b9fa181730364681d0e7
>> in Apache Spark [1]. Once you are at that commit, review the changes
>> made to the REPL code, find the corresponding code in the Scala 2.11
>> REPL source, and port the same changes there.
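>> 
>> Something along these lines should surface the relevant diff (repl/ is
>> where the shell code lives in the Spark tree):
>> 
>>   git clone https://github.com/apache/spark.git && cd spark
>>   git show --stat 191d7cf2a655d032f160b9fa181730364681d0e7 -- repl/
>>   git checkout 191d7cf2a655d032f160b9fa181730364681d0e7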
>> 
>> Thanks,
>> 
>> 1. http://githowto.com/getting_old_versions
>> 
>> Prashant Sharma
>> 
>> On Thu, Apr 9, 2015 at 4:40 PM, Alex Nakos <ana...@gmail.com> wrote:
>> 
>> Ok, what do i need to do in order to migrate the patch?
>> 
>> Thanks
>> Alex
>> 
>> On Thu, Apr 9, 2015 at 11:54 AM, Prashant Sharma <scrapco...@gmail.com>
>> wrote:
>> 
>> This is the JIRA I referred to:
>> https://issues.apache.org/jira/browse/SPARK-3256. Another reason I
>> haven't worked on it yet is a question of priorities: upgrading to
>> Scala 2.11.5 is non-trivial (the REPL has changed a bit), whereas
>> migrating that patch is much simpler.
>> 
>> Prashant Sharma
>> 
>> On Thu, Apr 9, 2015 at 4:16 PM, Alex Nakos <ana...@gmail.com> wrote:
>> 
>> Hi-
>> 
>> Was this the JIRA issue?
>> https://issues.apache.org/jira/browse/SPARK-2988
>> 
>> Any help in getting this working would be much appreciated!
>> 
>> Thanks
>> Alex
>> 
>> On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma <scrapco...@gmail.com>
>> wrote:
>> 
>> You are right, this needs to be done. I can work on it soon; I was not
>> sure whether anyone was even using the Scala 2.11 Spark REPL. There is
>> also a patch in the Scala 2.10 shell to support adding jars (I've lost
>> the JIRA ID), which has to be ported to Scala 2.11 too. If you (or
>> anyone else) are planning to work on it, I can help.
>> 
>> Prashant Sharma
>> 
>> On Thu, Apr 9, 2015 at 3:08 PM, anakos <ana...@gmail.com> wrote:
>> 
>> Hi-
>> 
>> I am having difficulty getting the 1.3.0 Spark shell to find an
>> external jar. I have built Spark locally for Scala 2.11 and I am
>> starting the REPL as follows:
>> 
>> bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar
>> 
>> I see the following line in the console output:
>> 
>> 15/04/09 09:52:15 INFO spark.SparkContext: Added JAR file:/opt/spark/spark-1.3.0_2.11-hadoop2.3/data-api-es-data-export-4.0.0.jar at http://192.168.115.31:54421/jars/data-api-es-data-export-4.0.0.jar with timestamp 1428569535904
>> 
>> but when I try to import anything from this jar, it's simply not
>> available. When I try to add the jar manually using the command
>> 
>> :cp /path/to/jar
>> 
>> the classes in the jar are still unavailable. I understand that 2.11 is
>> not officially supported, but has anyone been able to get an external
>> jar loaded in the 1.3.0 release? Is this a known issue? I have tried
>> searching around for answers, but the only thing I've found that may be
>> related is this:
>> 
>> https://issues.apache.org/jira/browse/SPARK-3257
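>> 
>> In case it's useful to anyone reproducing this: one thing I have not
>> yet tried is also putting the jar on the driver's classpath, in case it
>> reaches the executors but not the driver's compiler, e.g.:
>> 
>>   bin/spark-shell --master yarn \
>>     --jars data-api-es-data-export-4.0.0.jar \
>>     --driver-class-path data-api-es-data-export-4.0.0.jar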
>> 
>> Any/all help is much appreciated.
>> Thanks
>> Alex

