Yang Jie created SPARK-52088:
--------------------------------

             Summary: Redesign ClosureCleaner Implementation Due to JDK-8309635's Removal of Old Core Reflection and Inability to Modify Private Final Fields in Hidden Classes
                 Key: SPARK-52088
                 URL: https://issues.apache.org/jira/browse/SPARK-52088
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 4.1.0
            Reporter: Yang Jie


Java 22 removed the old core reflection implementation ([JDK-8309635|https://bugs.openjdk.org/browse/JDK-8309635]). As a result, the workaround for SPARK-40729, which sets `-Djdk.reflect.useDirectMethodHandle=false` to re-enable the old core reflection, may stop working on the next Java LTS release (unless that release reverts JDK-8309635). We may need to redesign the implementation of `ClosureCleaner`.
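
For context, the operation that breaks is the reflective write into a field captured by a REPL closure, as seen at `ClosureCleaner.setFieldAndIgnoreModifiers` in the stack trace below. The following is only an illustrative sketch of that pattern; the object and method names here are made up and this is not Spark's actual code:

```scala
import java.lang.reflect.Field

// Illustrative sketch only: overwrite a field captured by a closure object,
// ignoring its `final` modifier, in the style of ClosureCleaner's REPL cleanup.
object FinalFieldWriteSketch {
  def overwriteCapturedField(closure: AnyRef, name: String, value: AnyRef): Unit = {
    val field: Field = closure.getClass.getDeclaredField(name)
    field.setAccessible(true)
    // With the old core reflection (re-enabled via
    // -Djdk.reflect.useDirectMethodHandle=false) this write succeeds.
    // With the method-handle-based accessors that are the only implementation
    // after JDK-8309635, Field.set throws IllegalAccessException when the
    // field is final and its declaring class is a hidden (lambda) class.
    field.set(closure, value)
  }
}
```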

Currently, running the `repl` module tests with Java 22 via

```
build/sbt clean "repl/test"
```

produces the following failures:

```
[info] - broadcast vars *** FAILED *** (1 second, 141 milliseconds)
[info]   isContain was true Interpreter output contained 'Exception':
[info]   Welcome to
[info]         ____              __
[info]        / __/__  ___ _____/ /__
[info]       _\ \/ _ \/ _ `/ __/  '_/
[info]      /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-SNAPSHOT
[info]         /_/
[info]            
[info]   Using Scala version 2.13.16 (OpenJDK 64-Bit Server VM, Java 22.0.2)
[info]   Type in expressions to have them evaluated.
[info]   Type :help for more information.
[info]   
[info]   scala> 
[info]   scala> var array: Array[Int] = Array(0, 0, 0, 0, 0)
[info]   
[info]   scala> val broadcastArray: org.apache.spark.broadcast.Broadcast[Array[Int]] = Broadcast(0)
[info]   
[info]   scala> java.lang.InternalError: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x0000060001ecedd8.arg$1/putField, from class java.lang.Object (module java.base)
[info]     at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:207)
[info]     at java.base/jdk.internal.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:144)
[info]     at java.base/java.lang.reflect.Field.acquireOverrideFieldAccessor(Field.java:1200)
[info]     at java.base/java.lang.reflect.Field.getOverrideFieldAccessor(Field.java:1169)
[info]     at java.base/java.lang.reflect.Field.set(Field.java:836)
[info]     at org.apache.spark.util.ClosureCleaner$.setFieldAndIgnoreModifiers(ClosureCleaner.scala:564)
[info]     at org.apache.spark.util.ClosureCleaner$.cleanupScalaReplClosure(ClosureCleaner.scala:432)
[info]     at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:257)
[info]     at org.apache.spark.util.SparkClosureCleaner$.clean(SparkClosureCleaner.scala:39)
[info]     at org.apache.spark.SparkContext.clean(SparkContext.scala:2843)
[info]     at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:425)
[info]     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[info]     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[info]     at org.apache.spark.rdd.RDD.withScope(RDD.scala:417)
[info]     at org.apache.spark.rdd.RDD.map(RDD.scala:424)
[info]     ... 79 elided

...

[info] Run completed in 35 seconds, 38 milliseconds.
[info] Total number of tests run: 36
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 27, failed 9, canceled 0, ignored 0, pending 0
[info] *** 9 TESTS FAILED ***
[error] Failed tests:
[error]     org.apache.spark.repl.SingletonReplSuite
[error]     org.apache.spark.repl.ReplSuite
```

I tried switching to either `VarHandle` or `Unsafe#putObject`, but neither worked, because the failing test cases involve modifying a `private final` field declared in a hidden class.
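
A minimal sketch of those two attempts and where each one fails, using hypothetical method names (illustrative only, not the code that was actually tried):

```scala
import java.lang.invoke.{MethodHandles, VarHandle}

// Illustrative sketch of the two alternatives and why each one falls short.
object AlternativeWriteSketch {

  // Attempt 1: VarHandle. Even with a private lookup, a VarHandle resolved for
  // a final instance field does not support any write access mode, so there is
  // no way to overwrite the captured value through it.
  def canWriteViaVarHandle(owner: Class[_], name: String, fieldType: Class[_]): Boolean = {
    val lookup = MethodHandles.privateLookupIn(owner, MethodHandles.lookup())
    val vh = lookup.findVarHandle(owner, name, fieldType)
    vh.isAccessModeSupported(VarHandle.AccessMode.SET) // false for a final field
  }

  // Attempt 2: sun.misc.Unsafe. objectFieldOffset rejects fields declared in a
  // hidden class (which the REPL's lambda closures are), throwing
  // UnsupportedOperationException before putObject can even be attempted.
  def setViaUnsafe(obj: AnyRef, name: String, value: AnyRef): Unit = {
    val theUnsafe = classOf[sun.misc.Unsafe].getDeclaredField("theUnsafe")
    theUnsafe.setAccessible(true)
    val unsafe = theUnsafe.get(null).asInstanceOf[sun.misc.Unsafe]
    val offset = unsafe.objectFieldOffset(obj.getClass.getDeclaredField(name))
    unsafe.putObject(obj, offset, value)
  }
}
```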

 


