Cool! I restarted the build.
Best
--
Zoi
On Thursday, 11 January 2024 at 02:17:11 p.m. CET, Juri
Petersen <[email protected]> wrote:
Hi Zoi,
the Spark version itself was not changed, but support for Scala 2.11 and
Java 8 was removed.
However, locally I managed to fix the problem by explicitly adding "2.12" to
the artifactIds of the dependencies in the "wayang-tests-integration" pom.xml.
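To illustrate, a changed entry looks roughly like this (an illustrative snippet; the concrete artifactIds and versions are the ones already in that pom.xml, just with the Scala suffix appended):

    <dependency>
        <groupId>org.apache.wayang</groupId>
        <!-- Scala binary version appended explicitly to the artifactId -->
        <artifactId>wayang-spark_2.12</artifactId>
        <version>${project.version}</version>
    </dependency>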
I hope this is also reflected in a successful build of the PR once the
GitHub Action is started.
Best,
Juri
________________________________
From: Zoi Kaoudi <[email protected]>
Sent: 11 January 2024 14:12
To: [email protected] <[email protected]>
Subject: Re: Integration test failure caused by Spark
Hi Juri,
just to better understand: was this test failing before? Did you change the
Spark version?
Best
--
Zoi
On Thursday, 11 January 2024 at 11:39:19 a.m. CET, Juri
Petersen <[email protected]> wrote:
Hello,
in the process of finishing the PR
https://github.com/apache/incubator-wayang/pull/389 for issue
https://github.com/apache/incubator-wayang/issues/283,
I encountered failures when running the integration tests in
"wayang-tests-integration".
When verifying the build, I get the following error during test execution:
[ERROR] org.apache.wayang.tests.WordCountIT.testOnSparkToJava  Time elapsed: 0.03 s  <<< ERROR!
org.apache.wayang.core.api.exception.WayangException: Job execution failed.
    at org.apache.wayang.tests.WordCountIT.testOnSparkToJava(WordCountIT.java:281)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
    at org.apache.wayang.tests.WordCountIT.testOnSparkToJava(WordCountIT.java:281)
From resources like
https://java.tutorialink.com/running-unit-tests-with-spark-3-3-0-on-java-17-fails-with-illegalaccesserror-class-storageutils-cannot-access-class-sun-nio-ch-directbuffer/,
I can see that this is likely due to the configuration of Spark and Java 11.
The mentioned export flags for the "maven-surefire-plugin" didn't solve the
problem.
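For reference, the surefire configuration I tried looks roughly like this (a sketch along the lines of the linked post; the plugin version and exact placement follow whatever the module's pom.xml already defines):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
            <!-- Export the JDK-internal sun.nio.ch package to unnamed modules,
                 which Spark's StorageUtils needs on newer JDKs (per the linked post). -->
            <argLine>--add-exports java.base/sun.nio.ch=ALL-UNNAMED</argLine>
        </configuration>
    </plugin>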
Has anyone run into this error, or a similar one, and knows how to configure
the build to avoid it?
Thanks in advance,
Juri Petersen