[jira] [Comment Edited] (TOREE-375) Incorrect fully qualified name for spark context
[ https://issues.apache.org/jira/browse/TOREE-375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15855019#comment-15855019 ]

Jakob Odersky edited comment on TOREE-375 at 2/7/17 12:16 AM:
--

-Yrepl-class-based strikes again. I managed to track this down to the [refreshDefinitions()|https://github.com/apache/incubator-toree/blob/master/scala-interpreter/src/main/scala-2.11/org/apache/toree/kernel/interpreter/scala/ScalaInterpreterSpecific.scala#L79-L97] function that is called when jars are added dynamically. It appears that the `valueOfTerm` method does not find a value associated with any variable. The following snippet illustrates that this behaviour only happens when the `-Yrepl-class-based` option is set in the REPL (the option is required for Spark to correctly serialize objects).

{code}
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter._

object Main extends App {
  val settings = new Settings
  settings.usejavacp.value = true
  //settings.Yreplclassbased.value = true
  val iMain: IMain = new IMain(settings)
  iMain.initializeSynchronous()
  iMain.interpret("val x = 1")
  iMain.definedTerms.foreach { name =>
    println("defined term: " + name.toString)
    iMain.valueOfTerm(name.toString) match {
      case Some(value) => println("value: " + value)
      case None => println("no value")
    }
  }
}
{code}

The above code yields:

{code}
[info] Running foo.Main
[info] x: Int = 1
[info] defined term: x
[info] value: 1
{code}

When setting -Yrepl-class-based by uncommenting the line above:

{code}
[info] Running foo.Main
[info] x: Int = 1
[info] defined term: x
[info] no value
{code}

was (Author: jodersky):

-Yrepl-class-based strikes again. I managed to track this down to the [refreshDefinitions()|https://github.com/apache/incubator-toree/blob/master/scala-interpreter/src/main/scala-2.11/org/apache/toree/kernel/interpreter/scala/ScalaInterpreterSpecific.scala#L79-L97] function that is called when jars are added dynamically.
It appears that the `valueOfTerm` method does not find a value associated with any variable. The following snippet illustrates that this behaviour only happens when the `-Yrepl-class-based` option is set in the REPL (the option is required for Spark to correctly serialize objects).

{code}
object Main extends App {
  val settings = new Settings
  settings.usejavacp.value = true
  //settings.Yreplclassbased.value = true
  val iMain: IMain = new IMain(settings)
  iMain.initializeSynchronous()
  iMain.interpret("val x = 1")
  iMain.definedTerms.foreach { name =>
    println("defined term: " + name.toString)
    iMain.valueOfTerm(name.toString) match {
      case Some(value) => println("value: " + value)
      case None => println("no value")
    }
  }
}
{code}

The above code yields:

{code}
[info] Running foo.Main
[info] x: Int = 1
[info] defined term: x
[info] value: 1
{code}

When setting -Yrepl-class-based by uncommenting the line above:

{code}
[info] Running foo.Main
[info] x: Int = 1
[info] defined term: x
[info] no value
{code}

> Incorrect fully qualified name for spark context
> ------------------------------------------------
>
>         Key: TOREE-375
>         URL: https://issues.apache.org/jira/browse/TOREE-375
>     Project: TOREE
>  Issue Type: Bug
> Environment: Jupyter Notebook with Toree latest master (1a9c11f5f1381c15b691a716acd0e1f0432a9a35) and Spark 2.0.2, Scala 2.11
>    Reporter: Felix Schüler
>
> When running the snippet below in a cell I get a compile error for the MLContext constructor. Somehow the fully qualified name of the SparkContext gets messed up.
> The same does not happen when I start a Spark shell with the --jars option and create the MLContext there.
> Snippet (the SystemML jar is built with the latest master of SystemML):
> {code}
> %addjar file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar -f
> import org.apache.sysml.api.mlcontext._
> import org.apache.sysml.api.mlcontext.ScriptFactory._
> val ml = new MLContext(sc)
>
> Starting download from file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar
> Finished download of systemml-0.13.0-incubating-SNAPSHOT.jar
>
> Name: Compile Error
> Message: :25: error: overloaded method constructor MLContext with alternatives:
>   (x$1: org.apache.spark.api.java.JavaSparkContext)org.apache.sysml.api.mlcontext.MLContext
>   (x$1: org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)org.apache.sysml.api.mlcontext.MLContext
>  cannot be applied to (org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)
>        val ml = new MLContext(sc)
>                 ^
> StackTrace:
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
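The `valueOfTerm` failure described above comes down to how the REPL wraps each interpreted line: by default in a singleton object (statically accessible), under -Yrepl-class-based in a class (accessible only on an instance). A minimal illustrative sketch of that difference in plain Scala; the wrapper names here are made up for illustration, not the compiler's actual synthetic names:

```scala
// Illustrative sketch (not the compiler's real synthetic wrappers): the REPL
// normally wraps each line in an object, so `x` is reachable via a static
// accessor; under -Yrepl-class-based it wraps lines in a class, so `x` lives
// on an instance that a purely static lookup will not see.
object ObjectWrapper { val x = 1 }   // analogous to default object-based wrapping
class ClassWrapper { val x = 1 }     // analogous to -Yrepl-class-based wrapping

object WrapDemo extends App {
  println(ObjectWrapper.x)        // static path: no instance needed
  println((new ClassWrapper).x)   // instance path: a wrapper instance must exist
}
```

This is only an analogy for why a static-style lookup like `valueOfTerm` can come up empty under class-based wrapping, not a description of the interpreter's internals.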
[jira] [Comment Edited] (TOREE-375) Incorrect fully qualified name for spark context
[ https://issues.apache.org/jira/browse/TOREE-375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15876892#comment-15876892 ]

Jakob Odersky edited comment on TOREE-375 at 2/21/17 10:37 PM:
---

[~fschueler] so I'm actually not entirely sure what the cause of this bug is; however, I can confirm it is not specific to Toree. It can be reproduced in a standard Spark shell as well:

{code}
scala> :power
Power mode enabled. :phase is at typer.
import scala.tools.nsc._, intp.global._, definitions._
Try :help or completions for vals._ and power._

scala> power.intp.addUrlsToClassPath(jar)

scala> import org.apache.sysml.api.MLContext
import org.apache.sysml.api.MLContext

scala> val ml = new MLContext(sc)
:50: error: overloaded method constructor MLContext with alternatives:
  (x$1: org.apache.spark.api.java.JavaSparkContext)org.apache.sysml.api.MLContext
  (x$1: org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)org.apache.sysml.api.MLContext
 cannot be applied to (org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)
       val ml = new MLContext(sc)
{code}

was (Author: jodersky):

[~fschueler] so I'm actually not entirely sure what the cause of this bug is; however, I can confirm it is not specific to Toree. It can be reproduced in a standard Spark shell as well:

```
scala> :power
Power mode enabled. :phase is at typer.
import scala.tools.nsc._, intp.global._, definitions._
Try :help or completions for vals._ and power._

scala> power.intp.addUrlsToClassPath(jar)

scala> import org.apache.sysml.api.MLContext
import org.apache.sysml.api.MLContext

scala> val ml = new MLContext(sc)
:50: error: overloaded method constructor MLContext with alternatives:
  (x$1: org.apache.spark.api.java.JavaSparkContext)org.apache.sysml.api.MLContext
  (x$1: org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)org.apache.sysml.api.MLContext
 cannot be applied to (org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)
       val ml = new MLContext(sc)
```

> Incorrect fully qualified name for spark context
> ------------------------------------------------
>
>         Key: TOREE-375
>         URL: https://issues.apache.org/jira/browse/TOREE-375
>     Project: TOREE
>  Issue Type: Bug
> Environment: Jupyter Notebook with Toree latest master (1a9c11f5f1381c15b691a716acd0e1f0432a9a35) and Spark 2.0.2, Scala 2.11
>    Reporter: Felix Schüler
>    Priority: Critical
>
> When running the snippet below in a cell I get a compile error for the MLContext constructor. Somehow the fully qualified name of the SparkContext gets messed up.
> The same does not happen when I start a Spark shell with the --jars option and create the MLContext there.
> Snippet (the SystemML jar is built with the latest master of SystemML):
> {code}
> %addjar file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar -f
> import org.apache.sysml.api.mlcontext._
> import org.apache.sysml.api.mlcontext.ScriptFactory._
> val ml = new MLContext(sc)
>
> Starting download from file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar
> Finished download of systemml-0.13.0-incubating-SNAPSHOT.jar
>
> Name: Compile Error
> Message: :25: error: overloaded method constructor MLContext with alternatives:
>   (x$1: org.apache.spark.api.java.JavaSparkContext)org.apache.sysml.api.mlcontext.MLContext
>   (x$1: org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)org.apache.sysml.api.mlcontext.MLContext
>  cannot be applied to (org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)
>        val ml = new MLContext(sc)
>                 ^
> StackTrace:
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
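The spark-shell session above can also be wired up as a standalone program against `IMain`, the same API the first comment used. This is a hedged sketch only: the jar path is a placeholder you must point at a real jar, and the snippet merely combines the two ingredients from the comments (class-based wrapping plus `addUrlsToClassPath`); it does not assert the mangled name, since that surfaces as the type error quoted above.

```scala
import java.io.File
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// Hedged standalone sketch of the spark-shell repro: enable the class-based
// wrapping that Spark's REPL uses, then add a jar to the live interpreter's
// classpath, mirroring `power.intp.addUrlsToClassPath(jar)` from the session.
object AddJarRepro extends App {
  val settings = new Settings
  settings.usejavacp.value = true
  settings.Yreplclassbased.value = true // the flag Spark's REPL sets

  val intp = new IMain(settings)
  intp.initializeSynchronous()
  println("class-based wrapping: " + settings.Yreplclassbased.value)

  // Placeholder path: replace with a jar whose classes take types from the
  // pre-existing classpath (e.g. SparkContext) in their constructors.
  val jar = new File("/path/to/library.jar")
  if (jar.exists()) {
    intp.addUrlsToClassPath(jar.toURI.toURL)
    // Interpreting code that passes an existing value to a constructor from
    // the new jar should then reproduce the duplicated-prefix type error.
  }
}
```

Note that `scala.tools.nsc` APIs differ between Scala versions; this sketch targets the 2.11 interpreter used by the reporters above.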