In Java 11+, you will need to tell the JVM to allow access to internal
packages in some cases; this applies to any JVM application, not just
Spark. You will need flags like
"--add-opens=java.base/sun.nio.ch=ALL-UNNAMED", which you can see in the
Spark project's pom.xml.
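
On the command line it might look like this (a minimal sketch; the jar
name is a placeholder, and the exact set of packages to open varies by
Spark version and workload, so take the full list from that pom.xml
rather than from here):

    # Hypothetical launch of an app embedding spark-sql on Java 17;
    # only a few representative --add-opens flags are shown.
    java \
      --add-opens=java.base/java.lang=ALL-UNNAMED \
      --add-opens=java.base/java.nio=ALL-UNNAMED \
      --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
      -jar your-spark-app.jar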

Spark 3.2 is not guaranteed to work with Java 17 (3.3 should add
official support), but it may well work once you address those flags.

On Tue, Apr 12, 2022 at 7:05 AM Arunachalam Sibisakkaravarthi <
arunacha...@mcruncher.com> wrote:

> Hi guys,
>
> spark-sql_2.12:3.2.1 is used in our application.
>
> It throws the following exception when the app runs on JRE 17:
>
> java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$
> (in unnamed module @0x451f1bd4) cannot access class sun.nio.ch.DirectBuffer
> (in module java.base) because module java.base does not export sun.nio.ch
> to unnamed module @0x451f1bd4
>     at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
>     at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
>     at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
>     at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
>     at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
>     at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
>     at scala.Option.getOrElse(Option.scala:189)
>     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
>
> How do we fix this?
>
>
> *Thanks And Regards*
> *Sibi.Arunachalam*
> *mCruncher*
>
