[ 
https://issues.apache.org/jira/browse/SPARK-4813?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tathagata Das resolved SPARK-4813.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 1.2.1
                   1.1.2
                   1.3.0
                   1.0.3

> ContextWaiter didn't handle 'spurious wakeup'
> ---------------------------------------------
>
>                 Key: SPARK-4813
>                 URL: https://issues.apache.org/jira/browse/SPARK-4813
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>            Reporter: Shixiong Zhu
>             Fix For: 1.0.3, 1.3.0, 1.1.2, 1.2.1
>
>
> According to 
> [javadocs|https://docs.oracle.com/javase/7/docs/api/java/lang/Object.html#wait(long)],
> {quote}
> A thread can also wake up without being notified, interrupted, or timing out, 
> a so-called spurious wakeup. While this will rarely occur in practice, 
> applications must guard against it by testing for the condition that should 
> have caused the thread to be awakened, and continuing to wait if the 
> condition is not satisfied. In other words, waits should always occur in 
> loops, like this one:
>      synchronized (obj) {
>          while (<condition does not hold>)
>              obj.wait(timeout);
>          ... // Perform action appropriate to condition
>      }
> {quote}
> `wait` should always occur in a loop, but ContextWaiter.waitForStopOrError calls it only once:
> a spurious wakeup therefore makes the method return before the context has stopped or an error has been reported.
> {code}
>   def waitForStopOrError(timeout: Long = -1) = synchronized {
>     // If already had error, then throw it
>     if (error != null) {
>       throw error
>     }
>     // If not already stopped, then wait
>     if (!stopped) {
>       if (timeout < 0) wait() else wait(timeout)
>       if (error != null) throw error
>     }
>   }
> {code}
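The guard the javadoc asks for is a while loop around `wait` that re-checks the condition after every wakeup. The sketch below is illustrative only: the class and method names mirror the quoted snippet, the `notifyStop`/`notifyError` counterparts are assumed to exist and call `notifyAll`, and the actual fix shipped in Spark may be structured differently (e.g. using `java.util.concurrent` locks).

```scala
// Illustrative sketch of a spurious-wakeup-safe waiter, modeled on the
// quoted ContextWaiter snippet. Not the actual Spark fix.
class ContextWaiterSketch {
  private var error: Throwable = null
  private var stopped: Boolean = false

  // Hypothetical counterparts that set the condition and wake all waiters.
  def notifyError(e: Throwable): Unit = synchronized {
    error = e
    notifyAll()
  }

  def notifyStop(): Unit = synchronized {
    stopped = true
    notifyAll()
  }

  def waitForStopOrError(timeout: Long = -1): Unit = synchronized {
    val deadline =
      if (timeout < 0) Long.MaxValue else System.currentTimeMillis() + timeout
    // `wait` sits inside a loop: after any wakeup (notified or spurious)
    // the condition is re-checked, and if it still does not hold we wait
    // again for whatever time remains.
    while (!stopped && error == null) {
      val remaining = deadline - System.currentTimeMillis()
      if (remaining <= 0) return // timed out; neither stopped nor errored
      if (timeout < 0) wait() else wait(remaining)
    }
    if (error != null) throw error
  }
}
```

With this shape, a spurious wakeup simply loops back into `wait(remaining)` instead of falling through and returning early.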



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org