dinesh created SPARK-29690:
------------------------------

             Summary: Spark Shell  - Clear imports
                 Key: SPARK-29690
                 URL: https://issues.apache.org/jira/browse/SPARK-29690
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
    Affects Versions: 2.2.0
            Reporter: dinesh


I'm facing the following problem with the Spark Shell. In a shell session:
 # I imported the following: {{import scala.collection.immutable.HashMap}}
 # Then I realized my mistake and imported the correct class: {{import java.util.HashMap}}

But now I get the following error when running my code:

{code}
<console>:34: error: reference to HashMap is ambiguous;
it is imported twice in the same scope by
import java.util.HashMap
and import scala.collection.immutable.HashMap
val colMap = new HashMap[String, HashMap[String, String]]()
{code}


I have a long-running Spark Shell session, i.e. I do not want to close and reopen my shell. So, is there a way I can clear the previous import and use the correct class?

I know that we can also specify the fully qualified name, like {{val colMap = new java.util.HashMap[String, java.util.HashMap[String, String]]()}}.
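
A shorter alternative, if a different simple name is acceptable, is Scala's rename-on-import, which binds the Java class to a fresh alias and sidesteps the ambiguity without restarting the session. A minimal sketch (the alias name {{JHashMap}} is just an example):

{code}
// Bind java.util.HashMap to a new, unambiguous name for the rest of the session.
import java.util.{HashMap => JHashMap}

val colMap = new JHashMap[String, JHashMap[String, String]]()
colMap.put("outer", new JHashMap[String, String]())
{code}

The alias never collides with scala.collection.immutable.HashMap, so it keeps working even though both earlier imports remain in scope.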

But I'm asking whether there is a way to clear an incorrectly loaded class.

 

I thought spark-shell picks up imports from history the same way the Scala REPL does. If so, the previous HashMap should be shadowed by the new import statement.
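
For reference, plain Scala behaves the same way when both imports land in the same scope: two explicit imports of the same simple name are ambiguous, and shadowing only applies when the later import sits in a nested scope. A minimal sketch (the object name is illustrative):

{code}
object ImportAmbiguityDemo {
  import scala.collection.immutable.HashMap
  import java.util.HashMap

  // Does not compile if uncommented -- "reference to HashMap is ambiguous;
  // it is imported twice in the same scope by ...":
  // val colMap = new HashMap[String, HashMap[String, String]]()

  // Compiles: the fully qualified name bypasses the ambiguous simple name.
  val colMap = new java.util.HashMap[String, java.util.HashMap[String, String]]()
}
{code}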



