Thank you all so much for the kind words of encouragement on my first test
report.  As a follow-up, I ran all my HDFS and Yarn nodes on Java 8,
including my NodeManagers.  I then modified Spark's
conf/spark-defaults.conf according to Mr. Pan's prior post, and it worked:
I was able to submit SparkPi and my PySpark code using 4.0.0-preview1 to
Yarn, deploying successfully in both client and cluster mode.  Without the
changes, Yarn threw an UnsupportedClassVersionError for
org/apache/spark/deploy/yarn/ExecutorLauncher.

George
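For anyone replicating this: the docs change Cheng Pan mentions below boils
down to pointing the Yarn application master and the executors at a newer
JDK via environment overrides in conf/spark-defaults.conf.  A minimal
sketch, assuming Java 17 is installed at /usr/lib/jvm/java-17 on every
cluster node (that path is hypothetical; adjust it to your install):

```properties
# Hypothetical JDK path - point these at wherever Java 17 lives on the nodes.
# The AM (e.g. ExecutorLauncher in client mode) picks up the first setting,
# the executors pick up the second.
spark.yarn.appMasterEnv.JAVA_HOME   /usr/lib/jvm/java-17
spark.executorEnv.JAVA_HOME         /usr/lib/jvm/java-17
```

With that in place, an ordinary spark-submit against Yarn (client or
cluster deploy mode) should launch the AM and executors on the configured
JDK even when the NodeManagers themselves run an older Java.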

On Tue, Jun 18, 2024 at 6:26 AM Cheng Pan <pan3...@gmail.com> wrote:

> FYI, I have submitted SPARK-48651(
> https://github.com/apache/spark/pull/47010) to update the Spark on YARN
> docs for JDK configuration, looking forward to your feedback.
>
> Thanks,
> Cheng Pan
>
>
> On Jun 18, 2024, at 02:00, George Magiros <gmagi...@gmail.com> wrote:
>
> I successfully submitted and ran org.apache.spark.examples.SparkPi on Yarn
> using 4.0.0-preview1.  However, I got it to work only after fixing an issue
> with the Yarn NodeManagers (Hadoop v3.3.6 and v3.4.0).  Namely:
> 1. If the NodeManagers used Java 11, Yarn threw an error about not finding
> the jdk.incubator.vector module.
> 2. If the NodeManagers used Java 17, which has the jdk.incubator.vector
> module, Yarn threw a reflection error about a class not being found.
>
> To resolve the error and successfully calculate pi,
> 1. I ran Java 17 on the NodeManagers and
> 2. added 'export
> HADOOP_OPTS="--add-opens=java.base/java.lang=ALL-UNNAMED"' to their
> conf/hadoop-env.sh file.
>
> George
>
>
>
