[ https://issues.apache.org/jira/browse/SPARK-20706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16008010#comment-16008010 ]

Raphael Roth commented on SPARK-20706:
--------------------------------------

I also tested this with the Scala console 2.11.8, and it works fine, so I 
assume the bug is in spark-shell itself.
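
For reference, a minimal sketch of the workaround mentioned in the issue 
description below (evaluating val out = tmp as its own statement instead of 
compiling it together with the call), assuming a Spark 2.0.0 spark-shell 
session and the behavior as reported:

------------------------------
def myMethod() = "first definition"
val tmp = myMethod(); val out = tmp
println(out)           // prints "first definition"

def myMethod() = "second definition"  // redefine myMethod
val tmp = myMethod()                  // call the redefined method
val out = tmp                         // evaluated as a separate statement/block
println(out)                          // reported to print "second definition" as expected
------------------------------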

> Spark-shell not overriding method definition
> --------------------------------------------
>
>                 Key: SPARK-20706
>                 URL: https://issues.apache.org/jira/browse/SPARK-20706
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.0
>         Environment: Linux, Scala 2.11.8
>            Reporter: Raphael Roth
>            Priority: Minor
>
> In the following example, the definition of myMethod is not correctly updated:
> ------------------------------
> def myMethod()  = "first definition"
> val tmp = myMethod(); val out = tmp
> println(out) // prints "first definition"
> def myMethod()  = "second definition" // override above myMethod
> val tmp = myMethod(); val out = tmp 
> println(out) // should be "second definition" but is "first definition"
> ------------------------------
> I'm using semicolons to force the two statements to be compiled together; the 
> behavior can also be reproduced using :paste.
> So if I re-define myMethod, the implementation does not seem to be updated in 
> this case. I found that the second-to-last statement (val out = tmp) causes 
> this behavior: if it is moved into a separate block, the code works just fine.


