This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.5
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.5 by this push:
     new 47224b39f6c [SPARK-44542][CORE] Eagerly load SparkExitCode class in exception handler
47224b39f6c is described below

commit 47224b39f6c937cadf5946870a4dc8d0dabdfa40
Author: Xianjin <xian...@apache.org>
AuthorDate: Sun Jul 30 22:12:39 2023 -0500

    [SPARK-44542][CORE] Eagerly load SparkExitCode class in exception handler
    
    ### What changes were proposed in this pull request?
    1. Eagerly load the SparkExitCode class in SparkUncaughtExceptionHandler.
    
    ### Why are the changes needed?
    In some extreme cases, it's possible for SparkUncaughtExceptionHandler's exit/halt calls to throw an exception if the SparkExitCode class has not been loaded earlier. See the corresponding JIRA: [SPARK-44542](https://issues.apache.org/jira/browse/SPARK-44542) for more details.
    
    By eagerly loading the SparkExitCode class, we can make sure that at least halt/exit will work properly.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    No logic change, hence no new UTs.
    
    Closes #42195 from advancedxy/SPARK-44542.
    
    Authored-by: Xianjin <xian...@apache.org>
    Signed-off-by: Sean Owen <sro...@gmail.com>
    (cherry picked from commit 32498b390db99c9451b14c643456437a023c0d93)
    Signed-off-by: Sean Owen <sro...@gmail.com>
---
 .../scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala b/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
index e7712875536..b24129eb369 100644
--- a/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
+++ b/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
@@ -28,6 +28,12 @@ import org.apache.spark.internal.Logging
 private[spark] class SparkUncaughtExceptionHandler(val exitOnUncaughtException: Boolean = true)
   extends Thread.UncaughtExceptionHandler with Logging {
 
+  locally {
+    // eagerly load SparkExitCode class, so the System.exit and runtime.halt have a chance to be
+    // executed when the disk containing Spark jars is corrupted. See SPARK-44542 for more details.
+    val _ = SparkExitCode.OOM
+  }
+
   override def uncaughtException(thread: Thread, exception: Throwable): Unit = {
     try {
       // Make it explicit that uncaught exceptions are thrown when container is shutting down.
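The fix above relies on how the JVM loads classes: a class is loaded and initialized lazily on first use, so a class first referenced inside an uncaught-exception handler can itself fail to load (with a NoClassDefFoundError) if the disk holding the jars has gone bad by then. A minimal, self-contained Java sketch of the same eager-loading pattern, outside Spark — the `ExitCodes` class and the value 52 are illustrative stand-ins here, not Spark's actual code:

```java
public class EagerLoadDemo {
    // Hypothetical stand-in for Spark's SparkExitCode object; the value 52
    // mirrors Spark's OOM exit code but is only illustrative here.
    static final class ExitCodes {
        static final int OOM = 52;
    }

    static final class Handler implements Thread.UncaughtExceptionHandler {
        int lastExitCode = -1;

        Handler() {
            // Referencing a static field forces the JVM to load and
            // initialize ExitCodes now, at handler construction time,
            // rather than lazily on the first uncaught exception.
            @SuppressWarnings("unused")
            int ignored = ExitCodes.OOM;
        }

        @Override
        public void uncaughtException(Thread t, Throwable e) {
            // By this point ExitCodes is already in memory, so this lookup
            // cannot trigger class loading from a possibly-corrupted disk.
            lastExitCode = ExitCodes.OOM;
            // A real handler would call System.exit(lastExitCode) or
            // Runtime.getRuntime().halt(lastExitCode) here.
        }
    }

    public static void main(String[] args) {
        Handler h = new Handler();
        Thread.setDefaultUncaughtExceptionHandler(h);
        h.uncaughtException(Thread.currentThread(), new RuntimeException("boom"));
        System.out.println("handler used exit code " + h.lastExitCode);
    }
}
```

The `locally { ... }` block in the Scala diff plays the same role as the constructor body here: it runs during handler construction, well before any exception arrives.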

