[ https://issues.apache.org/jira/browse/SPARK-37302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kousuke Saruta updated SPARK-37302:
-----------------------------------
Description:
dev/run-tests.py fails if Scala 2.13 is used and guava or jetty-io is not in both the Maven and Coursier local repositories.
{code:java}
$ rm -rf ~/.m2/repository/*
$ # For Linux
$ rm -rf ~/.cache/coursier/v1/*
$ # For macOS
$ rm -rf ~/Library/Caches/Coursier/v1/*
$ dev/change-scala-version.sh 2.13
$ dev/test-dependencies.sh
$ build/sbt -Pscala-2.13 clean compile
...
[error] /home/kou/work/oss/spark-scala-2.13/common/network-common/src/main/java/org/apache/spark/network/util/TransportConf.java:24:1: error: package com.google.common.primitives does not exist
[error] import com.google.common.primitives.Ints;
[error] ^
[error] /home/kou/work/oss/spark-scala-2.13/common/network-common/src/main/java/org/apache/spark/network/client/TransportClientFactory.java:30:1: error: package com.google.common.annotations does not exist
[error] import com.google.common.annotations.VisibleForTesting;
[error] ^
[error] /home/kou/work/oss/spark-scala-2.13/common/network-common/src/main/java/org/apache/spark/network/client/TransportClientFactory.java:31:1: error: package com.google.common.base does not exist
[error] import com.google.common.base.Preconditions;
...
{code}
{code:java}
[error] /home/kou/work/oss/spark-scala-2.13/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionServer.scala:87:25: Class org.eclipse.jetty.io.ByteBufferPool not found - continuing with a stub.
[error] val connector = new ServerConnector(
[error] ^
[error] /home/kou/work/oss/spark-scala-2.13/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionServer.scala:87:21: multiple constructors for ServerConnector with alternatives:
[error] (x$1: org.eclipse.jetty.server.Server,x$2: java.util.concurrent.Executor,x$3: org.eclipse.jetty.util.thread.Scheduler,x$4: org.eclipse.jetty.io.ByteBufferPool,x$5: Int,x$6: Int,x$7: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector <and>
[error] (x$1: org.eclipse.jetty.server.Server,x$2: org.eclipse.jetty.util.ssl.SslContextFactory,x$3: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector <and>
[error] (x$1: org.eclipse.jetty.server.Server,x$2: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector <and>
[error] (x$1: org.eclipse.jetty.server.Server,x$2: Int,x$3: Int,x$4: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector
[error] cannot be invoked with (org.eclipse.jetty.server.Server, Null, org.eclipse.jetty.util.thread.ScheduledExecutorScheduler, Null, Int, Int, org.eclipse.jetty.server.HttpConnectionFactory)
[error] val connector = new ServerConnector(
[error] ^
[error] /home/kou/work/oss/spark-scala-2.13/core/src/main/scala/org/apache/spark/ui/JettyUtils.scala:207:13: Class org.eclipse.jetty.io.ClientConnectionFactory not found - continuing with a stub.
[error] new HttpClient(new HttpClientTransportOverHTTP(numSelectors), null)
[error] ^
[error] /home/kou/work/oss/spark-scala-2.13/core/src/main/scala/org/apache/spark/ui/JettyUtils.scala:287:25: multiple constructors for ServerConnector with alternatives:
[error] (x$1: org.eclipse.jetty.server.Server,x$2: java.util.concurrent.Executor,x$3: org.eclipse.jetty.util.thread.Scheduler,x$4: org.eclipse.jetty.io.ByteBufferPool,x$5: Int,x$6: Int,x$7: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector <and>
[error] (x$1: org.eclipse.jetty.server.Server,x$2: org.eclipse.jetty.util.ssl.SslContextFactory,x$3: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector <and>
[error] (x$1: org.eclipse.jetty.server.Server,x$2: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector <and>
[error] (x$1: org.eclipse.jetty.server.Server,x$2: Int,x$3: Int,x$4: org.eclipse.jetty.server.ConnectionFactory*)org.eclipse.jetty.server.ServerConnector
[error] cannot be invoked with (org.eclipse.jetty.server.Server, Null, org.eclipse.jetty.util.thread.ScheduledExecutorScheduler, Null, Int, Int, org.eclipse.jetty.server.ConnectionFactory)
[error] val connector = new ServerConnector(
{code}
The reason is that the exec-maven-plugin used in test-dependencies.sh downloads the POMs of guava and jetty-io but not the corresponding jars, and the script skips dependency testing if Scala 2.13 is used (if dependency testing runs, Maven downloads those jars).
{code}
if [[ "$SCALA_BINARY_VERSION" != "2.12" ]]; then
  # TODO(SPARK-36168) Support Scala 2.13 in dev/test-dependencies.sh
  echo "Skip dependency testing on $SCALA_BINARY_VERSION"
  exit 0
fi
{code}
{code:java}
$ find ~/.m2 -name "guava*"
...
/home/kou/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.pom
/home/kou/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.pom.sha1
...
/home/kou/.m2/repository/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom
/home/kou/.m2/repository/com/google/guava/guava-parent/14.0.1/guava-parent-14.0.1.pom.sha1
...
$ find ~/.m2 -name "jetty*"
...
/home/kou/.m2/repository/org/eclipse/jetty/jetty-io/9.4.43.v20210629/jetty-io-9.4.43.v20210629.pom
/home/kou/.m2/repository/org/eclipse/jetty/jetty-io/9.4.43.v20210629/jetty-io-9.4.43.v20210629.pom.sha1
...
{code}
Under these circumstances, building Spark with SBT fails. run-tests.py builds Spark with SBT after the dependency testing, so run-tests.py fails with Scala 2.13.
Further, I noticed that this issue can even happen with Scala 2.12 if the script exits at the following part.
{code}
if [ $? != 0 ]; then
  echo -e "Error while getting version string from Maven:\n$OLD_VERSION"
  exit 1
fi
{code}
This phenomenon is similar to SPARK-34762.
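The POM-without-jar state shown in the `find` output above can be spotted mechanically. Below is a rough diagnostic sketch; the `find_pom_only` helper is hypothetical and not part of Spark's scripts. It lists every artifact in a local Maven repository that has a `.pom` but no sibling `.jar`. Note that artifacts with `pom` packaging (such as guava-parent) legitimately have no jar and will appear as false positives.

```shell
# Hypothetical helper: report artifacts that have a .pom but no matching .jar
# in a Maven local repository. Parent POMs (packaging "pom") are expected to
# lack a jar, so they show up here as false positives.
find_pom_only() {
  repo="$1"
  find "$repo" -name '*.pom' | while read -r pom; do
    jar="${pom%.pom}.jar"
    [ -f "$jar" ] || echo "missing jar for: $pom"
  done
}

# Example against a throwaway directory that mimics the broken state above.
mkdir -p /tmp/m2demo/com/google/guava/guava/14.0.1
touch /tmp/m2demo/com/google/guava/guava/14.0.1/guava-14.0.1.pom
find_pom_only /tmp/m2demo
# prints: missing jar for: /tmp/m2demo/com/google/guava/guava/14.0.1/guava-14.0.1.pom
```

Running it against `~/.m2/repository` after `dev/test-dependencies.sh` should flag the guava and jetty-io artifacts described in this report.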
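Per the issue title, one way out is to fetch the missing jars explicitly rather than relying on the skipped dependency test. A minimal sketch, assuming Maven's standard `dependency:get` goal and the artifact coordinates observed in the repository listing above (the exact invocation the Spark fix uses may differ). The commands are echoed rather than executed so the sketch stays side-effect free:

```shell
# Sketch: print explicit-download commands for the artifacts whose POMs were
# cached without jars. dependency:get resolves an artifact (jar + pom) into
# the local Maven repository.
for coord in \
    com.google.guava:guava:14.0.1 \
    org.eclipse.jetty:jetty-io:9.4.43.v20210629; do
  echo "build/mvn -q org.apache.maven.plugins:maven-dependency-plugin:3.1.1:get -Dartifact=${coord}"
done
```

Piping each printed line to `sh` (or dropping the `echo`) would perform the actual downloads into `~/.m2/repository`, which is what the SBT build needs.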
> Explicitly download the dependencies of guava and jetty-io in
> test-dependencies.sh
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-37302
>                 URL: https://issues.apache.org/jira/browse/SPARK-37302
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Kousuke Saruta
>            Assignee: Kousuke Saruta
>            Priority: Major
--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org