[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2018-04-02 Thread Joe Pallas (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16423495#comment-16423495 ]

Joe Pallas commented on SPARK-22393:


Following up on my previous comment: [~mpetruska], it looks like the Spark 
retrofit is missing a change that should appear in 
{{SparkILoopInterpreter.SparkImportHandler}}. I believe it should include
{code:java}
override lazy val individualNames: List[Name] = importableSymbolsWithRenames map (_._2)
{code}
The corresponding change is in the [2.11 retrofit of 
SI-9881|https://github.com/scala/scala/pull/6195]. Does that look right?

> spark-shell can't find imported types in class constructors, extends clause
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-22393
>                 URL: https://issues.apache.org/jira/browse/SPARK-22393
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.2, 2.1.2, 2.2.0
>            Reporter: Ryan Williams
>            Assignee: Mark Petruska
>            Priority: Minor
>             Fix For: 2.3.0
>
>
> {code}
> $ spark-shell
> …
> scala> import org.apache.spark.Partition
> import org.apache.spark.Partition
> scala> class P(p: Partition)
> <console>:11: error: not found: type Partition
>        class P(p: Partition)
>                   ^
> scala> class P(val index: Int) extends Partition
> <console>:11: error: not found: type Partition
>        class P(val index: Int) extends Partition
>                                        ^
> {code}
> Any class that I {{import}} gives "not found: type ___" when used as a
> parameter to a class or in an extends clause; this applies to classes I
> import from JARs I provide via {{--jars}}, as well as core Spark classes as
> above.
> This worked in 1.6.3 but has been broken since 2.0.0.






[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2018-03-31 Thread Joe Pallas (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16421548#comment-16421548 ]

Joe Pallas commented on SPARK-22393:


The changes that were imported in [https://github.com/apache/spark/pull/19846] 
don't seem to cover all the cases that the Scala 2.12 changes covered. To be 
specific, this sequence:
{code}
import scala.reflect.runtime.{universe => ru}
import ru.TypeTag
class C[T: TypeTag](value: T)
{code}
works correctly in Scala 2.12 with {{-Yrepl-class-based}}, but does not work in 
spark-shell 2.3.0.

I don't understand the import-handling code well enough to pinpoint the 
problem, however: it figures out that it needs the import for {{TypeTag}}, but 
it doesn't recognize that that import depends on the previous one:
{noformat}
<console>:9: error: not found: value ru
       import ru.TypeTag
              ^
{noformat}
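
A possible workaround (untested here, so an assumption on my part) is to avoid 
the dependent import and name the type by its full path, so that only a single 
self-contained import is involved:
{code}
import scala.reflect.runtime.universe.TypeTag

class C[T: TypeTag](value: T)
{code}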

 




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-30 Thread Saisai Shao (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16273985#comment-16273985 ]

Saisai Shao commented on SPARK-22393:
-

Shall we wait for this before 2.2.1 is out? 




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-29 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16270912#comment-16270912 ]

Mark Petruska commented on SPARK-22393:
---

[~srowen], [~rdub], I can now confirm that the original bug fix pushed to 
Scala 2.12 fixes this issue. I succeeded in retrofitting the same changes into 
spark-shell; see https://github.com/apache/spark/pull/19846.
The original fix for Scala 2.12 can be found at 
https://github.com/scala/scala/pull/5640.
The downside is that the code is not the most approachable; I could not 
refactor it for better readability while also making sure it still compiles :).




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-29 Thread Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16270903#comment-16270903 ]

Apache Spark commented on SPARK-22393:
--

User 'mpetruska' has created a pull request for this issue:
https://github.com/apache/spark/pull/19846




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-29 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16270897#comment-16270897 ]

Sean Owen commented on SPARK-22393:
---

OK, so this is basically "fixed for Scala 2.12 only"?




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-29 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16270605#comment-16270605 ]

Mark Petruska commented on SPARK-22393:
---

Based on the comments above and the comments in the Scala PR 
(https://github.com/scala/scala/pull/6195), it is not practical to push a fix 
to the Scala repo for this.
It also seems that we should avoid changing the "class based" implementation 
to fix this issue.
So I am adding a workaround to the "scala-2.11" code.




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-25 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265666#comment-16265666 ]

Mark Petruska commented on SPARK-22393:
---

I believe the Scala REPL fix would also fix this. In my opinion, forcing the 
interpreter to non-class-based behaviour (`isClassBased`) should also make the 
issue disappear; I don't think the problem is caused by the code and overrides 
added to `SparkILoop`. (Sorry for being vague: I have some evidence supporting 
these statements, but nothing that proves them completely.)
[~rdub] I will do additional tests/experiments with REPL 2.11.8 to confirm, 
and maybe find a way to force the non-class-based behaviour...
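
For anyone wanting to reproduce this outside Spark, a sketch (assuming a plain 
Scala 2.11 distribution; `-Yrepl-class-based` is the flag that enables the 
class-based wrapping spark-shell uses):
{code}
# Default object-based wrapping: the import is visible to class definitions
$ scala
scala> import java.util.UUID
scala> class U(u: UUID)
defined class U

# Class-based wrapping, as used by spark-shell: reproduces the failure
$ scala -Yrepl-class-based
scala> import java.util.UUID
scala> class U(u: UUID)
<console>:11: error: not found: type UUID
{code}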




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Ryan Williams (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265543#comment-16265543 ]

Ryan Williams commented on SPARK-22393:
---

Just confirming: it's expected that the Scala 2.11 REPL works fine but 
spark-shell does not? Is spark-shell using REPL code paths that have this bug, 
which the stock Scala REPL avoids?




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265522#comment-16265522 ]

Sean Owen commented on SPARK-22393:
---

That's great detective work. You may be able to tell better than I, then: does 
this entail changing the way spark-shell overrides things in the Scala shell, 
or is it really just something that has to be fixed in Scala and picked up 
from there?

2.12 support is mostly there in Spark and will require 2.12.4, which has this 
fix, so that case is covered.
For 2.11, I think Spark is mostly stuck on 2.11.8, because the Scala shell 
changed between 2.11.8 and 2.11.11 and it was very hard to make one set of 
code that worked on both. So I'm not sure a backport to Scala 2.11 will help 
Spark in the end.




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265445#comment-16265445 ]

Mark Petruska commented on SPARK-22393:
---

Created a PR to retrofit the fix to scala repl 2.11: 
https://github.com/scala/scala/pull/6195




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265401#comment-16265401 ]

Mark Petruska commented on SPARK-22393:
---

The fix for this went into the 2.12 branch of scala: 
https://github.com/scala/scala/pull/5640




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265398#comment-16265398 ]

Mark Petruska commented on SPARK-22393:
---

https://github.com/scala/bug/issues/9881




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265394#comment-16265394 ]

Mark Petruska commented on SPARK-22393:
---

Proof for the statement above: tracing the import handler generated by 
`import java.util.UUID` yields

{code}
MemberHandlers$ImportHandler (refs: java)
expr: java.util
selectors: List(ImportSelector(UUID,17,UUID,17))
targetType: 
importsWildcard: false
implicitSymbols: List()
importedSymbols: List()
importString: import java.util.UUID
{code}

(also, an excerpt from `scala.tools.nsc.interpreter.MemberHandlers.ImportHandler`:
{code}
def importedSymbols = individualSymbols ++ wildcardSymbols

...

/** Complete list of names imported by a wildcard */
lazy val wildcardNames: List[Name]   = wildcardSymbols map (_.name)
lazy val individualNames: List[Name] = individualSymbols map (_.name)

/** The names imported by this statement */
override lazy val importedNames: List[Name] = wildcardNames ++ individualNames
{code}
)

and the source code of `scala.tools.nsc.interpreter.Imports.importsCode` 
(excerpt):

{code}
// Single symbol imports might be implicits! See bug #1752.  Rather than
// try to finesse this, we will mimic all imports for now.
def keepHandler(handler: MemberHandler) = handler match {
  // While defining classes in class based mode - implicits are not needed.
  case h: ImportHandler if isClassBased && definesClass =>
    h.importedNames.exists(x => wanted.contains(x))
  case _: ImportHandler => true
  case x if generousImports =>
    x.definesImplicit || (x.definedNames exists (d => wanted.exists(w => d.startsWith(w))))
  case x =>
    x.definesImplicit || (x.definedNames exists wanted)
}
{code}

Since `importedSymbols` (and hence `importedNames`) is empty for this handler, 
`keepHandler` drops the import whenever the snippet defines a class in 
class-based mode (`isClassBased && definesClass`), which appears to be why the 
generated wrapper lacks the import.





[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-24 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16265385#comment-16265385 ]

Mark Petruska commented on SPARK-22393:
---

After digging deeper, I found that this is very likely the underlying 
problem: https://issues.scala-lang.org/browse/SI-9881




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-23 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16264102#comment-16264102 ]

Mark Petruska commented on SPARK-22393:
---

The problem is also reproducible with classes unrelated to Spark.
Example:

{code}
import java.util.UUID
class U(u: UUID)
...
error: not found: type UUID
...
{code}




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-19 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16258476#comment-16258476 ]

Mark Petruska commented on SPARK-22393:
---

Trace of the 2.11 version:

{code}
...
parse("

class $read extends Serializable { 
  ;

class $iw extends Serializable { 
class $iw extends Serializable { 

  class P(p: Partition) 



}
...
{code}

Strangely, the missing import (and the others) is present when the class is 
wrapped inside an object:

{code}
...
parse("

class $read extends Serializable { 
  ;

class $iw extends Serializable { 
val $line3$read = $line3.$read.INSTANCE
import $line3$read.$iw.$iw.`spark`;
class $iw extends Serializable { 
import org.apache.spark.SparkContext._
class $iw extends Serializable { 
class $iw extends Serializable { 
import spark.implicits._
class $iw extends Serializable { 
import spark.sql
class $iw extends Serializable { 
import org.apache.spark.sql.functions._
class $iw extends Serializable { 
import org.apache.spark.Partition
class $iw extends Serializable { 

  object P {
class P(p: Partition)
  } 



}
...
{code}
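
The object wrapper corresponds to REPL input like the following (a sketch; the 
test tables in the earlier comments also list this form as working):
{code}
scala> import org.apache.spark.Partition
import org.apache.spark.Partition

scala> object P { class P(p: Partition) }
defined object P
{code}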




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-19 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16258469#comment-16258469 ]

Mark Petruska commented on SPARK-22393:
---

The difference between the 2.11 and 2.12 Scala REPLs can be seen with the 
_trace_ options enabled.
2.12:

{code}
...
parse("

sealed class $read extends _root_.java.io.Serializable { 
  ;

sealed class $iw extends _root_.java.io.Serializable { 
// $line14.$read.INSTANCE definedNames List(), curImps Set()
import org.apache.spark.Partition
sealed class $iw extends _root_.java.io.Serializable { 

class P(p: Partition)


}
...
{code}

In 2.11, by contrast, the Scala REPL does not insert `import 
org.apache.spark.Partition`, and it omits the import only for `ClassDef`s. The 
underlying reason is unknown and will need more investigation.

Until then, the workaround `type Partition = org.apache.spark.Partition` seems 
feasible.
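
As REPL input, the workaround looks like this (a sketch, output abbreviated):
{code}
scala> type Partition = org.apache.spark.Partition
defined type alias Partition

scala> class P(p: Partition)
defined class P
{code}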




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-19 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16258468#comment-16258468 ]

Mark Petruska commented on SPARK-22393:
---

With the 2.12 build:

{code}
import org.apache.spark.Partition
{code}


|| Code || Result ||
| {code} class P(p: Partition) {code}  | {color:green} OK {color} |
| {code} class P(val index: Int) extends Partition {code} | {color:green} OK {color} |
| {code} var p: Partition = _ {code}   | {color:green} OK {color} |
| {code} def a(p: Partition): Int = 0 {code} | {color:green} OK {color} |
| {code} def a(): Partition = p {code} | {color:green} OK {color} |
| {code}
object P {
  class P(p: Partition)
} {code} | {color:green} OK {color} |
| {code}
:paste
import org.apache.spark.Partition
class P(p: Partition)
{code} | {color:green} OK {color} |
| {code}
type Partition = org.apache.spark.Partition
class P(p: Partition)
{code} | {color:green} OK {color} |


To start spark shell 2.12 and the underlying scala repl with _trace_, _info_ 
and _debug_ settings:
{code}
$JAVA_HOME/bin/java -cp $SPARK_HOME/conf/:$SPARK_HOME/assembly/target/scala-2.12/jars/* -Dscala.usejavacp=true -Dscala.repl.trace=true -Dscala.repl.info=true -Dscala.repl.debug=true -Dscala.repl.prompt="spark>>>  " -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" spark-shell
{code}




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-11-19 Thread Mark Petruska (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16258441#comment-16258441 ]

Mark Petruska commented on SPARK-22393:
---

Tested with spark-shell build 2.11:

{code}
import org.apache.spark.Partition
{code}


|| Code || Result ||
| {code} class P(p: Partition) {code}  | {color:red} not found: type Partition {color} |
| {code} class P(x: Int) extends Partition {code} | {color:red} not found: type Partition {color} |
| {code} var p: Partition = _ {code}   | {color:green} OK {color} |
| {code} def a(p: Partition): Int = 0 {code} | {color:green} OK {color} |
| {code} def a(): Partition = p {code} | {color:green} OK {color} |
| {code}
object P {
  class P(p: Partition)
} {code} | {color:green} OK {color} |
| {code}
:paste
import org.apache.spark.Partition
class P(p: Partition)
{code} | {color:green} OK {color} |
| {code}
type Partition = org.apache.spark.Partition
class P(p: Partition)
{code} | {color:green} OK {color} |


To start spark shell and the underlying scala repl with _trace_, _info_ and 
_debug_ settings:
{code}
$JAVA_HOME/bin/java -cp $SPARK_HOME/conf/:$SPARK_HOME/assembly/target/scala-2.11/jars/* -Dscala.usejavacp=true -Dscala.repl.trace=true -Dscala.repl.info=true -Dscala.repl.debug=true -Dscala.repl.prompt="spark>>>  " -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" spark-shell
{code}




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-10-29 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16224194#comment-16224194 ]

Sean Owen commented on SPARK-22393:
---

I'm guessing it's something to do with how it overrides the shell 
initialization or the classloader. It could be worth trying the 2.12 build and 
shell, as the shell integration there is a little less hacky. But really, no 
idea off the top of my head.




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-10-29 Thread Ryan Williams (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16224158#comment-16224158 ]

Ryan Williams commented on SPARK-22393:
---

Everything works fine in a Scala shell ({{scala -cp 
$SPARK_HOME/jars/spark-core_2.11-2.2.0.jar}}) and via {{sbt console}} in a 
project that depends on Spark, so the problem seems specific to {{spark-shell}}.
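
For reference, the working plain-REPL session sketched out (assuming the jar 
path above):
{code}
$ scala -cp $SPARK_HOME/jars/spark-core_2.11-2.2.0.jar
scala> import org.apache.spark.Partition
import org.apache.spark.Partition

scala> class P(p: Partition)
defined class P
{code}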




[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

2017-10-29 Thread Sean Owen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16224040#comment-16224040 ]

Sean Owen commented on SPARK-22393:
---

That's a weird one. {{class P(p: org.apache.spark.Partition)}} works fine, as 
does {{ {import org.apache.spark.Partition; class P(p: Partition)} }}. I think 
this is some subtlety of how the Scala shell interpreter works.
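
Both working forms as REPL input (a sketch, output abbreviated):
{code}
scala> class P(p: org.apache.spark.Partition)   // fully qualified name: works
defined class P

scala> { import org.apache.spark.Partition; class P(p: Partition) }   // local import: works
{code}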
