[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265522#comment-16265522 ]
Sean Owen commented on SPARK-22393:
-----------------------------------

That's great detective work. You may be able to tell better than I can, then: does this entail changing the way spark-shell overrides parts of the Scala shell, or is it really just something that has to be fixed in Scala and picked up from there? 2.12 support is mostly there in Spark, and will require 2.12.4, which has this fix, so we're OK there. For 2.11, I think Spark is mostly stuck on 2.11.8, because the Scala shell changed between 2.11.8 and 2.11.11 and it was very hard to make one set of code that worked on both. So I'm not sure a backport to Scala 2.11 would help Spark in the end.

> spark-shell can't find imported types in class constructors, extends clause
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-22393
>                 URL: https://issues.apache.org/jira/browse/SPARK-22393
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.2, 2.1.2, 2.2.0
>            Reporter: Ryan Williams
>            Priority: Minor
>
> {code}
> $ spark-shell
> …
> scala> import org.apache.spark.Partition
> import org.apache.spark.Partition
>
> scala> class P(p: Partition)
> <console>:11: error: not found: type Partition
>        class P(p: Partition)
>                   ^
>
> scala> class P(val index: Int) extends Partition
> <console>:11: error: not found: type Partition
>        class P(val index: Int) extends Partition
>                                        ^
> {code}
>
> Any class that I {{import}} gives "not found: type ___" when used as a
> parameter to a class or in an extends clause; this applies to classes I
> import from JARs I provide via {{--jars}} as well as to core Spark classes
> as above.
> This worked in 1.6.3 but has been broken since 2.0.0.
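A possible workaround until a fixed Scala version is picked up (a sketch, not verified on every affected version): since it is the imported simple name that fails to resolve, referring to the type by its fully qualified name, which does not depend on the import, may sidestep the problem:

{code}
scala> // Hypothetical workaround: the fully qualified name does not rely on
scala> // the import that the REPL's class wrapping appears to lose track of.
scala> class P(p: org.apache.spark.Partition)
defined class P

scala> class Q(val index: Int) extends org.apache.spark.Partition
defined class Q
{code}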