Github user jinxing64 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21342#discussion_r189900950
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/exchange/BroadcastExchangeExec.scala ---
    @@ -111,12 +112,18 @@ case class BroadcastExchangeExec(
               SQLMetrics.postDriverMetricUpdates(sparkContext, executionId, metrics.values.toSeq)
               broadcasted
             } catch {
    +          // SPARK-24294: To bypass scala bug: https://github.com/scala/bug/issues/9554, we throw
    +          // SparkFatalException, which is a subclass of Exception. ThreadUtils.awaitResult
    +          // will catch this exception and re-throw the wrapped fatal throwable.
               case oe: OutOfMemoryError =>
    --- End diff --
    
    To be clear, @cloud-fan, do you mean that, ideally, all OOM errors thrown during `relationFuture` should be of type `SparkOutOfMemoryError`? (`SparkOutOfMemoryError` is a subclass of `OutOfMemoryError`.)
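    
    Just to make the mechanism concrete (not part of the diff): below is a minimal, self-contained sketch of the wrap-and-rethrow pattern under discussion. `FatalWrapper`, `BroadcastOomSketch`, `awaitResult`, and `relationFuture` here are illustrative stand-ins I made up for Spark's `SparkFatalException`, `ThreadUtils.awaitResult`, and the real broadcast future, not the actual Spark code.
    
    ```scala
    import scala.concurrent.{Await, ExecutionContext, Future}
    import scala.concurrent.duration._
    
    // Illustrative stand-in for SparkFatalException: a plain Exception that wraps a
    // fatal Throwable so it can travel through Scala's Future machinery, which does
    // not reliably propagate fatal errors (https://github.com/scala/bug/issues/9554).
    final class FatalWrapper(val throwable: Throwable) extends Exception(throwable)
    
    object BroadcastOomSketch {
      implicit val ec: ExecutionContext = ExecutionContext.global
    
      // Inside the future, the fatal error is caught and re-thrown wrapped in a
      // non-fatal Exception, mirroring the catch clause in the diff above.
      val relationFuture: Future[String] = Future {
        try {
          throw new OutOfMemoryError("simulated OOM while building the broadcast relation")
        } catch {
          case oe: OutOfMemoryError => throw new FatalWrapper(oe)
        }
      }
    
      // The awaiting side unwraps the wrapper and re-throws the original fatal
      // throwable, which is what ThreadUtils.awaitResult does for SparkFatalException.
      def awaitResult[T](future: Future[T], atMost: Duration): T =
        try {
          Await.result(future, atMost)
        } catch {
          case fw: FatalWrapper => throw fw.throwable
        }
    
      def main(args: Array[String]): Unit =
        try {
          awaitResult(relationFuture, 10.seconds)
        } catch {
          case oom: OutOfMemoryError =>
            println(s"original fatal error recovered on the caller side: $oom")
        }
    }
    ```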


---
