[ 
https://issues.apache.org/jira/browse/SPARK-29690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

dinesh updated SPARK-29690:
---------------------------
    Description: 
I'm facing the following problem with Spark Shell. In a shell session:
 # I imported {color:#57d9a3}{{import scala.collection.immutable.HashMap}}{color}
 # Then I realized my mistake and imported the correct class: {color:#57d9a3}{{import java.util.HashMap}}{color}

But now I get the following error when running my code:

{color:#de350b}{{<console>:34: error: reference to HashMap is ambiguous;
it is imported twice in the same scope by
import java.util.HashMap
and import scala.collection.immutable.HashMap
val colMap = new HashMap[String, HashMap[String, String]]()}}{color}

I have a long-running Spark Shell session, i.e. I do not want to close and reopen my shell. So, is there a way I can clear the previous import and use the correct class?
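For what it's worth, the Scala 2 REPL that spark-shell is built on has a {{:reset}} command that discards all session definitions and imports. Whether it interacts cleanly with spark-shell (which pre-binds {{spark}} and {{sc}}) is an assumption worth testing; a sketch of the session:

```scala
// In the REPL: :reset discards everything defined so far, including imports.
// (In spark-shell this would likely also drop the spark/sc bindings.)
// scala> import scala.collection.immutable.HashMap
// scala> :reset
// scala> import java.util.HashMap
// scala> val colMap = new HashMap[String, HashMap[String, String]]()
```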

I know that I can also use the fully qualified name, like: {color:#57d9a3}{{val colMap = new java.util.HashMap[String, java.util.HashMap[String, String]]()}}{color}

But I'm looking for a way to clear an incorrectly loaded class.
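Another workaround that avoids both the ambiguity and the fully qualified names is Scala's renaming import, which gives the Java class a local alias in the session; a minimal sketch (the {{JHashMap}} alias is my choice, not anything from the original report):

```scala
// Rename java.util.HashMap at import time so it no longer collides with
// the scala.collection.immutable.HashMap already imported in this session.
import java.util.{HashMap => JHashMap}

val colMap = new JHashMap[String, JHashMap[String, String]]()
val inner = new JHashMap[String, String]()
inner.put("k", "v")
colMap.put("outer", inner)
```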

 

I thought Spark Shell picks up imports from history the same way the Scala REPL does. If so, the previous HashMap import should be shadowed by the new import statement.




> Spark Shell  - Clear imports
> ----------------------------
>
>                 Key: SPARK-29690
>                 URL: https://issues.apache.org/jira/browse/SPARK-29690
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.2.0
>            Reporter: dinesh
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
