Jakob Odersky created SPARK-11832:
-------------------------------------

             Summary: Spark shell does not work from sbt with scala 2.11
                 Key: SPARK-11832
                 URL: https://issues.apache.org/jira/browse/SPARK-11832
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
            Reporter: Jakob Odersky
            Priority: Minor


With Scala 2.11, running the spark shell task from within sbt fails; running the shell from a built distribution works.

h3. Steps to reproduce
# change the Scala version: {{dev/change-scala-version.sh 2.11}}
# start sbt: {{build/sbt -Dscala-2.11}}
# run the shell task: {{sparkShell}}

h3. Stacktrace
{code}
Failed to initialize compiler: object scala in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
        at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
        at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:894)
        at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:893)
        at scala.tools.nsc.interpreter.IMain$Request.importsPreamble$lzycompute(IMain.scala:893)
        at scala.tools.nsc.interpreter.IMain$Request.importsPreamble(IMain.scala:893)
        at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:915)
        at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1325)
        at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1324)
        at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64)
        at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1324)
        at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:906)
        at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:995)
        at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:990)
        at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:577)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:563)
        at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:802)
        at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:836)
        at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:694)
        at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:404)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcZ$sp(SparkILoop.scala:39)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:38)
        at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:38)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:213)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:38)
        at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:922)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
        at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
        at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
        at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
        at org.apache.spark.repl.Main$.main(Main.scala:49)
        at org.apache.spark.repl.Main.main(Main.scala)
{code}

h3. Workaround
In {{repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala}}, append {{s.usejavacp.value = true}} to the repl settings.
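
A sketch of where the change goes (the surrounding lines are paraphrased from memory of the 2.11 {{Main.scala}} and may not match it exactly; only the last line is the actual addition):
{code}
// In repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala,
// where the interpreter settings are built (surrounding lines approximate):
val s = new Settings()
s.processArguments(List(
  "-Yrepl-class-based",
  "-Yrepl-outdir", s"${outputDir.getAbsolutePath}",
  "-classpath", getAddedJars.mkString(File.pathSeparator)), true)
s.usejavacp.value = true  // <-- the workaround
{code}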

I haven't looked into the details of {{scala.tools.nsc.Settings}}; maybe someone has an idea of what's going on.
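
For reference, here is a minimal standalone sketch (not Spark code; object name is hypothetical) of how {{usejavacp}} affects an embedded interpreter built from a {{Settings}} object:
{code}
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

object ReplDemo extends App {
  val settings = new Settings
  // Without this line, an interpreter embedded in a JVM whose classpath
  // is managed by the host (e.g. sbt) typically fails with
  // "object scala in compiler mirror not found", as in the stacktrace above.
  settings.usejavacp.value = true
  val repl = new IMain(settings)
  repl.interpret("""println("hello from the embedded REPL")""")
}
{code}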
Also, to be clear, this bug only affects Scala 2.11 from within sbt; calling spark-shell from a distribution, or from anywhere using Scala 2.10, works.


