[ https://issues.apache.org/jira/browse/SPARK-53226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18035769#comment-18035769 ]
Jimmy Wang commented on SPARK-53226:
------------------------------------
Hi [~dongjoon],
May I ask whether this issue is still open? I would be happy to help with it.
> Spark-Shell run fails with Java 22 ~ 25
> ---------------------------------------
>
> Key: SPARK-53226
> URL: https://issues.apache.org/jira/browse/SPARK-53226
> Project: Spark
> Issue Type: Sub-task
> Components: Spark Core
> Affects Versions: 4.1.0
> Reporter: Dongjoon Hyun
> Priority: Blocker
>
> JDK-8305104 and JDK-8309635 (Java 22) remove the `jdk.reflect.useDirectMethodHandle=false` workaround:
> {code}
> - boolean newImpl = Boolean.getBoolean("jdk.reflect.useDirectMethodHandle");
> {code}
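>
> With that flag gone, Java 22 and later always use the method-handle-based reflection implementation, which refuses to write final instance fields of hidden classes (lambda classes have been hidden classes since Java 15). That is exactly the write `ClosureCleaner.setFieldAndIgnoreModifiers` performs via `Field.set` in the traces below. A minimal standalone sketch of the failure, assuming JDK 22+ and HotSpot's synthetic `arg$1` naming for lambda capture fields (both implementation details, not guaranteed):
> {code}
> // Standalone reproduction sketch (not Spark code); run on JDK 22+.
> object FinalFieldWriteRepro {
>   def main(args: Array[String]): Unit = {
>     val captured = new Array[Int](5)
>     val f: Runnable = () => println(captured.length) // lambda capturing `captured`
>     // The lambda class is a hidden class; its capture field (e.g. arg$1) is final.
>     val field = f.getClass.getDeclaredFields.head
>     field.setAccessible(true) // still succeeds: suppresses language access checks
>     // On JDK 22+ this throws java.lang.InternalError caused by
>     // IllegalAccessException: "final field has no write access".
>     field.set(f, new Array[Int](5))
>   }
> }
> {code}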
> *Java 25 (Release Candidate)*
> {code}
> $ bin/spark-shell
> WARNING: Using incubator modules: jdk.incubator.vector
> WARNING: package sun.security.action not in java.base
> Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-preview1
>       /_/
> Using Scala version 2.13.16 (OpenJDK 64-Bit Server VM, Java 25)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 25/08/09 13:21:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> WARNING: A terminally deprecated method in sun.misc.Unsafe has been called
> WARNING: sun.misc.Unsafe::allocateMemory has been called by io.netty.util.internal.PlatformDependent0$2 (file:/Users/dongjoon/APACHE/spark-4.1.0-preview1-bin-hadoop3/jars/netty-common-4.1.122.Final.jar)
> WARNING: Please consider reporting this to the maintainers of class io.netty.util.internal.PlatformDependent0$2
> WARNING: sun.misc.Unsafe::allocateMemory will be removed in a future release
> Spark context Web UI available at http://localhost:4040
> Spark context available as 'sc' (master = local[*], app id = local-1754770881328).
> Spark session available as 'spark'.
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
> var array = new Array[Int](5)
> val broadcastArray = sc.broadcast(array)
> sc.parallelize(0 to 4).map(x => broadcastArray.value(x)).collect()
> array(0) = 5
> sc.parallelize(0 to 4).map(x => broadcastArray.value(x)).collect()
> // Exiting paste mode... now interpreting.
> java.lang.InternalError: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x000000f001979948.arg$1/putField, from class java.lang.Object (module java.base)
>   at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:208)
>   at java.base/jdk.internal.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:109)
>   at java.base/java.lang.reflect.Field.acquireOverrideFieldAccessor(Field.java:1195)
>   at java.base/java.lang.reflect.Field.getOverrideFieldAccessor(Field.java:1164)
>   at java.base/java.lang.reflect.Field.set(Field.java:832)
>   at org.apache.spark.util.ClosureCleaner$.setFieldAndIgnoreModifiers(ClosureCleaner.scala:564)
>   at org.apache.spark.util.ClosureCleaner$.cleanupScalaReplClosure(ClosureCleaner.scala:432)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:257)
>   at org.apache.spark.util.SparkClosureCleaner$.clean(SparkClosureCleaner.scala:39)
>   at org.apache.spark.SparkContext.clean(SparkContext.scala:2843)
>   at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:425)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:417)
>   at org.apache.spark.rdd.RDD.map(RDD.scala:424)
>   ... 46 elided
> Caused by: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x000000f001979948.arg$1/putField, from class java.lang.Object (module java.base)
>   at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:889)
>   at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectField(MethodHandles.java:3441)
>   at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectSetter(MethodHandles.java:3432)
>   at java.base/java.lang.invoke.MethodHandleImpl$1.unreflectField(MethodHandleImpl.java:1607)
>   at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:186)
>   ... 60 more
> {code}
> *Java 22*
> {code}
> $ bin/spark-shell
> WARNING: Using incubator modules: jdk.incubator.vector
> Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-preview1
>       /_/
> Using Scala version 2.13.16 (OpenJDK 64-Bit Server VM, Java 22.0.2)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 25/08/09 13:28:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Spark context Web UI available at http://localhost:4040
> Spark context available as 'sc' (master = local[*], app id = local-1754771302910).
> Spark session available as 'spark'.
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
> var array = new Array[Int](5)
> val broadcastArray = sc.broadcast(array)
> sc.parallelize(0 to 4).map(x => broadcastArray.value(x)).collect()
> array(0) = 5
> sc.parallelize(0 to 4).map(x => broadcastArray.value(x)).collect()
> // Exiting paste mode... now interpreting.
> java.lang.InternalError: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x000001f000c51468.arg$1/putField, from class java.lang.Object (module java.base)
>   at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:207)
>   at java.base/jdk.internal.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:144)
>   at java.base/java.lang.reflect.Field.acquireOverrideFieldAccessor(Field.java:1200)
>   at java.base/java.lang.reflect.Field.getOverrideFieldAccessor(Field.java:1169)
>   at java.base/java.lang.reflect.Field.set(Field.java:836)
>   at org.apache.spark.util.ClosureCleaner$.setFieldAndIgnoreModifiers(ClosureCleaner.scala:564)
>   at org.apache.spark.util.ClosureCleaner$.cleanupScalaReplClosure(ClosureCleaner.scala:432)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:257)
>   at org.apache.spark.util.SparkClosureCleaner$.clean(SparkClosureCleaner.scala:39)
>   at org.apache.spark.SparkContext.clean(SparkContext.scala:2843)
>   at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:425)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:417)
>   at org.apache.spark.rdd.RDD.map(RDD.scala:424)
>   ... 46 elided
> Caused by: java.lang.IllegalAccessException: final field has no write access: $Lambda/0x000001f000c51468.arg$1/putField, from class java.lang.Object (module java.base)
>   at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:889)
>   at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectField(MethodHandles.java:3609)
>   at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectSetter(MethodHandles.java:3600)
>   at java.base/java.lang.invoke.MethodHandleImpl$1.unreflectField(MethodHandleImpl.java:1619)
>   at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:185)
>   ... 60 more
> {code}
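>
> Since JDK 22+ offers no supported way to reflectively write a final field of a hidden class, one illustrative direction (a sketch under assumptions, not a proposed patch) is to guard the reflective write so the `InternalError` does not escape:
> {code}
> // Illustrative sketch only, not Spark's actual fix: attempt the write and
> // report failure instead of propagating InternalError/IllegalAccessException.
> object SafeFieldWriter {
>   import java.lang.reflect.Field
>
>   def trySetField(obj: AnyRef, field: Field, value: AnyRef): Boolean =
>     try {
>       field.setAccessible(true)
>       field.set(obj, value)
>       true
>     } catch {
>       // JDK 22+: final fields of hidden classes (e.g. lambda capture fields)
>       // reject writes with InternalError caused by IllegalAccessException.
>       case _: InternalError | _: IllegalAccessException => false
>     }
> }
> {code}
> Whether skipping the write is safe depends on why ClosureCleaner rewrites those fields in the first place; an alternative direction would be to rebuild the closure without the offending captures instead of mutating it in place.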