This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6e6a3a1f631a [SPARK-45830][CORE] Refactor `StorageUtils#bufferCleaner`
6e6a3a1f631a is described below

commit 6e6a3a1f631a53e3cf5332f88e52d6c6686ba529
Author: yangjie01 <yangji...@baidu.com>
AuthorDate: Sun Nov 12 14:46:31 2023 -0800

    [SPARK-45830][CORE] Refactor `StorageUtils#bufferCleaner`
    
    ### What changes were proposed in this pull request?
    This PR refactors `StorageUtils#bufferCleaner` as follows:
    - Change the type of `bufferCleaner` from `DirectBuffer => Unit` to `ByteBuffer => Unit`
    - Call `unsafe.invokeCleaner` directly instead of invoking it via reflection
    
    ### Why are the changes needed?
    1. Since Scala 2.13.9, `-release` is recommended over `-target` for compilation. However, because the `sun.nio.ch` package is not exported, its classes are invisible during Java cross-compilation, for example when building or testing with Java 21 and `-release:17`. After this PR, the following compilation errors no longer occur when building the core module with Java 21 and `-release:17`:
    
    ```
    [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
    [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: object nio is not a member of package sun
    [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:200: not found: type DirectBuffer
    [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
    [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:220: not found: type DirectBuffer
    [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: Unused import
    ```
    
    2. Calling `unsafe.invokeCleaner` directly also performs better than the reflective call: it is at least 30% faster.
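
    The change can be sketched as a standalone snippet. The `theUnsafe` field lookup and the two cleaner variants below mirror this patch; the object name `BufferCleanerSketch` and the `main` driver are illustrative only, and the code assumes `sun.misc.Unsafe` is accessible (as it is in Spark's build):

    ```scala
    import java.nio.ByteBuffer
    import sun.misc.Unsafe

    object BufferCleanerSketch {
      // Obtain the singleton Unsafe instance via its private `theUnsafe` field.
      private val unsafe: Unsafe = {
        val field = classOf[Unsafe].getDeclaredField("theUnsafe")
        field.setAccessible(true)
        field.get(null).asInstanceOf[Unsafe]
      }

      // Before: look up `invokeCleaner` reflectively, then go through Method.invoke
      // on every call.
      private val reflectiveCleaner: ByteBuffer => Unit = {
        val cleanerMethod = classOf[Unsafe].getMethod("invokeCleaner", classOf[ByteBuffer])
        buffer => cleanerMethod.invoke(unsafe, buffer)
      }

      // After: call `invokeCleaner` directly; no reflection on the hot path and no
      // reference to the unexported `sun.nio.ch.DirectBuffer` type.
      private val directCleaner: ByteBuffer => Unit =
        buffer => unsafe.invokeCleaner(buffer)

      def main(args: Array[String]): Unit = {
        val buf = ByteBuffer.allocateDirect(1024)
        directCleaner(buf) // frees the off-heap memory backing `buf`
      }
    }
    ```

    Because `invokeCleaner` takes a plain `ByteBuffer`, the direct call also removes the only compile-time dependency on `sun.nio.ch`.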
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    - Pass GitHub Actions
    - Manually verified that building the core module with Java 21 and `-release:17` no longer produces the compilation failures listed above
    
    Note: other classes are still invisible under cross-compilation; this needs to be fixed in a follow-up.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #43675 from LuciferYang/bufferCleaner.
    
    Lead-authored-by: yangjie01 <yangji...@baidu.com>
    Co-authored-by: YangJie <yangji...@baidu.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 core/src/main/scala/org/apache/spark/storage/StorageUtils.scala | 9 +++------
 1 file changed, 3 insertions(+), 6 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala b/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
index e73a65e09cb4..c409ee37a06a 100644
--- a/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
+++ b/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
@@ -23,7 +23,6 @@ import scala.collection.Map
 import scala.collection.mutable
 
 import sun.misc.Unsafe
-import sun.nio.ch.DirectBuffer
 
 import org.apache.spark.SparkConf
 import org.apache.spark.internal.{config, Logging}
@@ -197,13 +196,11 @@ private[spark] class StorageStatus(
 /** Helper methods for storage-related objects. */
 private[spark] object StorageUtils extends Logging {
 
-  private val bufferCleaner: DirectBuffer => Unit = {
-    val cleanerMethod =
-      Utils.classForName("sun.misc.Unsafe").getMethod("invokeCleaner", classOf[ByteBuffer])
+  private val bufferCleaner: ByteBuffer => Unit = {
     val unsafeField = classOf[Unsafe].getDeclaredField("theUnsafe")
     unsafeField.setAccessible(true)
     val unsafe = unsafeField.get(null).asInstanceOf[Unsafe]
-    buffer: DirectBuffer => cleanerMethod.invoke(unsafe, buffer)
+    buffer: ByteBuffer => unsafe.invokeCleaner(buffer)
   }
 
   /**
@@ -217,7 +214,7 @@ private[spark] object StorageUtils extends Logging {
   def dispose(buffer: ByteBuffer): Unit = {
     if (buffer != null && buffer.isInstanceOf[MappedByteBuffer]) {
       logTrace(s"Disposing of $buffer")
-      bufferCleaner(buffer.asInstanceOf[DirectBuffer])
+      bufferCleaner(buffer)
     }
   }
 

