Hi.

According to the release notes[1], and specifically the ticket "Build and Run 
Spark on Java 17" (SPARK-33772)[2], Spark now supports running on Java 17.

However, when running a unit test that uses Spark 3.3.0 under Java 17 
(Temurin-17+35), with Maven 3.8.6 and maven-surefire-plugin 3.0.0-M7, the 
test fails with:

java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in 
unnamed module @0x1e7ba8d9) cannot access class sun.nio.ch.DirectBuffer (in 
module java.base) because module java.base does not export sun.nio.ch to 
unnamed module @0x1e7ba8d9
  at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
  at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
  at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
  at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:353)
  at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:290)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:339)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:194)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:279)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:464)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
  at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
  […]
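
For reference, the failing test is nothing exotic. A minimal sketch along 
these lines reproduces it (the class and names here are illustrative, not 
from our actual project):

import org.apache.spark.sql.SparkSession;
import org.junit.jupiter.api.Test;

// Illustrative test class; the names are placeholders, not from the real project.
public class SparkJava17Test {

  @Test
  void startsLocalSparkSession() {
    // StorageUtils$ is initialised while the session is being created, so
    // getOrCreate() is where the IllegalAccessError above is thrown.
    SparkSession spark = SparkSession.builder()
        .master("local[*]")
        .appName("java17-repro")
        .getOrCreate();
    spark.stop();
  }
}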

There is a recent StackOverflow question, "Java 17 solution for Spark - 
java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.storage.StorageUtils"[3]. It was asked only 2 months ago, 
but that still pre-dated the Spark 3.3.0 release, and thus pre-dated official 
support for Java 17. The solution proposed there amounts to configuring the 
Surefire plugin with this argLine:

<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>--add-exports java.base/sun.nio.ch=ALL-UNNAMED</argLine>
  </configuration>
</plugin>

And, yes, this works.
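
For a one-off run, the same flag can also be supplied on the command line, 
via what I believe is Surefire's argLine user property, without touching the 
POM:

mvn test -DargLine="--add-exports java.base/sun.nio.ch=ALL-UNNAMED"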

Now, I understand what this flag achieves … without it the JVM module system 
won’t allow Spark to use the sun.nio.ch.DirectBuffer class. My question is 
whether the requirement to add this flag is currently documented anywhere. I 
couldn’t find it, and it’s likely to start affecting more people as they 
switch to Java 17. Right now the web is mostly full of suggestions to use an 
earlier version of Java.
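
For what it’s worth, the flag isn’t Surefire-specific: as far as I can tell, 
any JVM that embeds Spark as a library needs it, e.g. when launching an 
application directly (the jar and main class here are just placeholders):

java --add-exports java.base/sun.nio.ch=ALL-UNNAMED -cp my-app.jar com.example.Main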

Cheers,

—
Greg.


[1]: https://spark.apache.org/releases/spark-release-3-3-0.html
[2]: https://issues.apache.org/jira/browse/SPARK-33772
[3]: https://stackoverflow.com/questions/72230174