[ https://issues.apache.org/jira/browse/SPARK-40729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie updated SPARK-40729:
-----------------------------
        Parent: SPARK-40730
    Issue Type: Sub-task  (was: Improvement)

> Spark-shell run failed with Java 19
> -----------------------------------
>
>                 Key: SPARK-40729
>                 URL: https://issues.apache.org/jira/browse/SPARK-40729
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Shell
>    Affects Versions: 3.4.0
>            Reporter: Yang Jie
>            Priority: Major
>
> {code:java}
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> 22/10/10 19:37:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 22/10/10 19:38:00 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
> Spark context Web UI available at http://localhost:4041
> Spark context available as 'sc' (master = local, app id = local-1665401880396).
> Spark session available as 'spark'.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 3.3.0
>       /_/
>
> Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 19)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
> var array = new Array[Int](5)
> val broadcastArray = sc.broadcast(array)
> sc.parallelize(0 to 4).map(x => broadcastArray.value(x)).collect()
> array(0) = 5
> sc.parallelize(0 to 4).map(x => broadcastArray.value(x)).collect()
> // Exiting paste mode, now interpreting.
> java.lang.InternalError: java.lang.IllegalAccessException: final field has no write access: $Lambda$2365/0x000000080199eef0.arg$1/putField, from class java.lang.Object (module java.base)
>   at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:167)
>   at java.base/jdk.internal.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:145)
>   at java.base/java.lang.reflect.Field.acquireOverrideFieldAccessor(Field.java:1184)
>   at java.base/java.lang.reflect.Field.getOverrideFieldAccessor(Field.java:1153)
>   at java.base/java.lang.reflect.Field.set(Field.java:820)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:406)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:163)
>   at org.apache.spark.SparkContext.clean(SparkContext.scala:2491)
>   at org.apache.spark.rdd.RDD.$anonfun$map$1(RDD.scala:414)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
>   at org.apache.spark.rdd.RDD.map(RDD.scala:413)
>   ... 43 elided
> Caused by: java.lang.IllegalAccessException: final field has no write access: $Lambda$2365/0x000000080199eef0.arg$1/putField, from class java.lang.Object (module java.base)
>   at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:955)
>   at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectField(MethodHandles.java:3511)
>   at java.base/java.lang.invoke.MethodHandles$Lookup.unreflectSetter(MethodHandles.java:3502)
>   at java.base/java.lang.invoke.MethodHandleImpl$1.unreflectField(MethodHandleImpl.java:1630)
>   at java.base/jdk.internal.reflect.MethodHandleAccessorFactory.newFieldAccessor(MethodHandleAccessorFactory.java:145)
>   ... 55 more
> scala>
> {code}
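> For context: this looks like a consequence of JEP 416 (JDK 18+), which reimplemented core reflection on top of method handles. {{Field.set}} on the final captured field of a lambda (lambda proxies have been hidden classes since JDK 15) now throws {{IllegalAccessException}} even after {{setAccessible(true)}} succeeds, and the stack trace above shows exactly that write being attempted from {{ClosureCleaner.clean}} (ClosureCleaner.scala:406). A minimal sketch of the underlying JDK behavior, outside Spark ({{arg$1}} is the implementation-specific name of the captured field, matching the error message above):
> {code:scala}
> // Standalone sketch (no Spark needed), assuming Scala 2.12 on JDK 19.
> object LambdaFinalFieldWrite {
>   def main(args: Array[String]): Unit = {
>     val captured = new Array[Int](5)
>     // Scala 2.12 compiles this closure via LambdaMetafactory, so at runtime
>     // `fn` is an instance of a hidden $Lambda class whose captured state
>     // lives in a final field named arg$1 (as in the error message above).
>     val fn: () => Int = () => captured(0)
>     val field = fn.getClass.getDeclaredField("arg$1")
>     field.setAccessible(true) // still succeeds, as the stack trace shows
>     // Succeeds up to Java 17 (legacy Unsafe-based accessor); on JDK 18+ the
>     // method-handle accessor throws IllegalAccessException:
>     //   "final field has no write access"
>     field.set(fn, null)
>   }
> }
> {code}
> JEP 416 documents a {{-Djdk.reflect.useDirectMethodHandle=false}} switch that re-enables the legacy implementation on JDK 18; if it is still honored on 19 it could serve as a temporary workaround, though a proper fix would presumably avoid reflective writes to final fields in {{ClosureCleaner}}.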



