[ https://issues.apache.org/jira/browse/SPARK-20706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16006139#comment-16006139 ]

Sean Owen commented on SPARK-20706:
-----------------------------------

I think this is more of a Scala shell issue than anything, though I can't 
reproduce it in Scala 2.12 (I don't have 2.11 handy). Yes, for some reason it's 
taking the value of tmp from when the block starts executing. As you say, it 
doesn't happen in 'normal' code and, I'm guessing, not in compiled code either, 
so I don't know how significant this is.
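For what it's worth, the reporter's workaround can be sketched like this in 
spark-shell (a sketch based on the description below; behavior as reported on 
Scala 2.11, not verified on 2.12):

------------------------------
def myMethod() = "second definition" // redefine as in the report
val tmp = myMethod() // single statement, compiled as its own block
val out = tmp // separate block: reads the updated tmp
println(out) // reported to print "second definition" once split up
------------------------------

Keeping the write to tmp and its read in separate REPL blocks avoids the stale 
capture the reporter describes.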

> Spark-shell not overriding method definition
> --------------------------------------------
>
>                 Key: SPARK-20706
>                 URL: https://issues.apache.org/jira/browse/SPARK-20706
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.0
>         Environment: Linux, Scala 2.11.8
>            Reporter: Raphael Roth
>            Priority: Minor
>
> In the following example, the definition of myMethod is not correctly updated:
> ------------------------------
> def myMethod()  = "first definition"
> val tmp = myMethod(); val out = tmp
> println(out) // prints "first definition"
> def myMethod()  = "second definition" // override above myMethod
> val tmp = myMethod(); val out = tmp 
> println(out) // should be "second definition" but is "first definition"
> ------------------------------
> I'm using semicolons to force the two statements to be compiled at the same 
> time. It's also possible to reproduce the behavior using :paste.
> So if I redefine myMethod, the implementation is not updated in this case. I 
> figured out that the second-to-last statement (val out = tmp) causes this 
> behavior; if it is moved into a separate block, the code works just fine.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
