Hi.
> So the above issue occurs when building and testing a Maven project with
> Spark 3.3.0 and Java 17, rather than when testing the spark-3.3 source code?
Yes, that’s correct — it’s my project’s unit test that fails, not a Spark
source code unit test.
Sorry for the confusion - and thanks for the info about
Date: Thursday, June 23, 2022 14:11
To: "Yang,Jie(INF)"
Cc: "user@spark.apache.org"
Subject: Re: [Java 17] --add-exports required?
Hi.

I am running on macOS 12.4, using an ‘Adoptium’ JDK from
https://adoptium.net/download. The version details are:

$ java -version
openjdk version "17.0.3" 2022-04-19
OpenJDK Runtime Environment Temurin-17.0.3+7 (build 17.0.3+7)
OpenJDK 64-Bit Server VM Temurin-17.0.3+7 (build 17.0.3+7, mixed mode,
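For reference, a common workaround while the export is still needed is to pass the JVM option to the forked test JVM through surefire's argLine property. This is a sketch, not the thread author's actual command; the project name and the exact set of options a given Spark version needs are assumptions:

```shell
# Run the project's unit tests under Java 17, opening the internal
# sun.nio.ch package to unnamed modules (required by some Spark code paths).
# -DargLine is read by maven-surefire-plugin and applied to the test JVM.
mvn test -DargLine="--add-exports=java.base/sun.nio.ch=ALL-UNNAMED"
```

Alternatively the same value can be set once in the surefire plugin's `<argLine>` configuration in pom.xml, so every developer gets it without remembering the flag.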
Hi, Greg
"--add-exports java.base/sun.nio.ch=ALL-UNNAMED" should no longer need to be
added now that SPARK-33772 is complete, so to answer your question I need more
details for testing:
1. Where can I download Java 17 (Temurin-17+35)?
2. What test commands do you use?
Yang Jie
On 2022/6/23
Hi.
According to the release notes[1], and specifically the ticket Build and Run
Spark on Java 17 (SPARK-33772)[2], Spark now supports running on Java 17.
However, using Java 17 (Temurin-17+35) with Maven (3.8.6) and
maven-surefire-plugin (3.0.0-M7), when running a unit test that uses Spark