This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 92cabf6  [SPARK-28759][BUILD] Upgrade scala-maven-plugin to 4.2.0 and fix build profile on AppVeyor
92cabf6 is described below

commit 92cabf63067b211a7cb06d27da8b67fbfe72c4d2
Author: HyukjinKwon <gurwls...@apache.org>
AuthorDate: Fri Aug 30 09:39:15 2019 -0700

[SPARK-28759][BUILD] Upgrade scala-maven-plugin to 4.2.0 and fix build profile on AppVeyor

### What changes were proposed in this pull request?

This PR proposes to upgrade scala-maven-plugin from 3.4.4 to 4.2.0.

The earlier upgrade to 4.1.1 was reverted due to an unexpected build failure on AppVeyor. The root cause seems to be an issue specific to AppVeyor: loading the system library 'kernel32.dll' fails.

```
Suppressed: java.lang.NoClassDefFoundError: Could not initialize class com.sun.jna.platform.win32.Kernel32
  at sbt.internal.io.WinMilli$.getHandle(Milli.scala:264)
  at sbt.internal.io.WinMilli$.getModifiedTimeNative(Milli.scala:289)
  at sbt.internal.io.WinMilli$.getModifiedTimeNative(Milli.scala:260)
  at sbt.internal.io.MilliNative.getModifiedTime(Milli.scala:61)
  at sbt.internal.io.Milli$.getModifiedTime(Milli.scala:360)
  at sbt.io.IO$.$anonfun$getModifiedTimeOrZero$1(IO.scala:1373)
  at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
  at sbt.internal.io.Retry$.liftedTree2$1(Retry.scala:38)
  at sbt.internal.io.Retry$.impl$1(Retry.scala:38)
  at sbt.internal.io.Retry$.apply(Retry.scala:52)
  at sbt.internal.io.Retry$.apply(Retry.scala:24)
  at sbt.io.IO$.getModifiedTimeOrZero(IO.scala:1373)
  at sbt.internal.inc.caching.ClasspathCache$.fromCacheOrHash$1(ClasspathCache.scala:44)
  at sbt.internal.inc.caching.ClasspathCache$.$anonfun$hashClasspath$1(ClasspathCache.scala:53)
  at scala.collection.parallel.mutable.ParArray$Map.leaf(ParArray.scala:659)
  at scala.collection.parallel.Task.$anonfun$tryLeaf$1(Tasks.scala:53)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at scala.util.control.Breaks$$anon$1.catchBreak(Breaks.scala:67)
  at scala.collection.parallel.Task.tryLeaf(Tasks.scala:56)
  at scala.collection.parallel.Task.tryLeaf$(Tasks.scala:50)
  at scala.collection.parallel.mutable.ParArray$Map.tryLeaf(ParArray.scala:650)
  at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal(Tasks.scala:170)
  ... 25 more
```

Setting `-Djna.nosys=true` makes JNA load the native library bundled in its jar instead of the system copy. With this flag, the build works fine.

### Why are the changes needed?

It upgrades the plugin to pick up bug fixes, and it repairs the CI build.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

It was tested at https://github.com/apache/spark/pull/25497

Closes #25633 from HyukjinKwon/SPARK-28759.

Authored-by: HyukjinKwon <gurwls...@apache.org>
Signed-off-by: Dongjoon Hyun <dh...@apple.com>
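As an illustration of what the `jna.nosys` property changes, here is a minimal, hypothetical Scala sketch (not part of this patch; it assumes the `jna` and `jna-platform` artifacts are on the classpath, and `JnaNosysCheck` is an invented name). Initializing `Kernel32` forces JNA to resolve its native dispatch library; with `jna.nosys=true`, JNA extracts that library from its own jar instead of looking for a system-installed copy, which is the load that failed on AppVeyor.

```scala
// Minimal sketch, assuming jna and jna-platform on the classpath.
// Not part of the patch; for illustration of the jna.nosys property only.
object JnaNosysCheck {
  def main(args: Array[String]): Unit = {
    // Must be set before any JNA class initializes; equivalent to passing
    // -Djna.nosys=true on the JVM command line, as the appveyor.yml change does.
    System.setProperty("jna.nosys", "true")

    if (com.sun.jna.Platform.isWindows()) {
      // Touching Kernel32.INSTANCE runs the class's static initializer, i.e. the
      // same native load that previously failed with NoClassDefFoundError.
      val k32 = com.sun.jna.platform.win32.Kernel32.INSTANCE
      println(s"Kernel32 initialized via JNA: $k32")
    } else {
      println("Not on Windows; Kernel32 is unavailable here, but jna.nosys still applies.")
    }
  }
}
```

In the patch itself, the property is simply added to the `mvn` invocation in appveyor.yml, as shown in the diff below.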
---
 appveyor.yml | 4 +++-
 pom.xml      | 2 +-
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/appveyor.yml b/appveyor.yml
index b0e946c..a61436c 100644
--- a/appveyor.yml
+++ b/appveyor.yml
@@ -51,7 +51,9 @@ install:
   - cmd: R -e "packageVersion('knitr'); packageVersion('rmarkdown'); packageVersion('testthat'); packageVersion('e1071'); packageVersion('survival')"
 
 build_script:
-  - cmd: mvn -DskipTests -Psparkr -Phive package
+  # '-Djna.nosys=true' is required to avoid kernel32.dll load failure.
+  # See SPARK-28759.
+  - cmd: mvn -DskipTests -Psparkr -Phive -Djna.nosys=true package
 
 environment:
   NOT_CRAN: true

diff --git a/pom.xml b/pom.xml
index 0288c6f..6544b50 100644
--- a/pom.xml
+++ b/pom.xml
@@ -2295,7 +2295,7 @@
       <plugin>
         <groupId>net.alchim31.maven</groupId>
         <artifactId>scala-maven-plugin</artifactId>
-        <version>3.4.4</version>
+        <version>4.2.0</version>
         <executions>
           <execution>
             <id>eclipse-add-source</id>

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org