[ https://issues.apache.org/jira/browse/SPARK-33772?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17415357#comment-17415357 ]

Lukas Rytz edited comment on SPARK-33772 at 9/15/21, 7:39 AM:
--------------------------------------------------------------

> Could you test it with `--add-opens` please, Lukas Rytz?

I have not tested `-XX:+IgnoreUnrecognizedVMOptions` within the Spark build, 
and I don't know how Java options are managed in Spark. But 
`IgnoreUnrecognizedVMOptions` has been around for a while; it works even on 
Java 6. You need to make sure to pass it to the `java` command before any 
`--add-opens` options, so that older JVMs ignore the options they don't 
recognize instead of failing at startup.
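For illustration, an invocation might look something like this (a minimal 
sketch; the jar and main class are placeholders):

```
# IgnoreUnrecognizedVMOptions comes first, so a JVM that does not know
# --add-opens (e.g. Java 8) ignores that option instead of failing.
# Classpath and main class below are placeholders.
java \
  -XX:+IgnoreUnrecognizedVMOptions \
  --add-opens=java.base/java.lang=ALL-UNNAMED \
  -cp app.jar com.example.Main
```

The same command line then runs on Java 8 (which skips the unrecognized 
`--add-opens`) and on Java 9+ (which honors it).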

> If you look at SPARK-24417 (Java 11), you can estimate what is coming to us.

It's certainly good to be careful. Our experience from the Scala compiler and 
the Scala community build (which builds most of the Scala open-source 
ecosystem) is that the Java 8 -> 9 upgrade was the most painful one (because 
of the module system). Everything after that (in particular 11 -> 17) was 
relatively simple.

I think the most common issue, as already noted above, is that JDK 16 changed 
the default to `--illegal-access=deny`. This can be worked around by setting 
`--illegal-access=permit`, but that option will eventually be removed; adding 
the necessary `--add-opens` options is the better long-term fix. 
[https://www.oracle.com/java/technologies/javase/16-relnotes.html]
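
In a Spark deployment, one place such options could go is the extra Java 
options configs; a sketch, assuming the usual `spark-submit` configs, and 
with the opened package, jar, and main class chosen purely as examples (the 
real `--add-opens` list depends on which JDK internals are accessed 
reflectively):

```
# Sketch only: the opened package is an example; the actual set of
# --add-opens flags depends on what the job accesses reflectively.
# Jar and main class are placeholders.
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
  --class com.example.Main app.jar
```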


> Build and Run Spark on Java 17
> ------------------------------
>
>                 Key: SPARK-33772
>                 URL: https://issues.apache.org/jira/browse/SPARK-33772
>             Project: Spark
>          Issue Type: New Feature
>          Components: Build
>    Affects Versions: 3.3.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>
> Apache Spark supports Java 8 and Java 11 (LTS). The next Java LTS version is 
> 17.
> ||Version||Release Date||
> |Java 17 (LTS)|September 2021|
> Apache Spark has a release plan, and the `Spark 3.2` code freeze was in 
> July, along with the release branch cut.
> - https://spark.apache.org/versioning-policy.html
> Supporting a new Java version is considered a new feature, which cannot be 
> backported.


