[ https://issues.apache.org/jira/browse/SPARK-40022?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-40022:
------------------------------------

    Assignee: Apache Spark

> YarnClusterSuite should not be ABORTED when there is no Python3 environment
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-40022
>                 URL: https://issues.apache.org/jira/browse/SPARK-40022
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 3.4.0
>            Reporter: Yang Jie
>            Assignee: Apache Spark
>            Priority: Minor
>
> Executing "build/mvn clean test -pl resource-managers/yarn -Pyarn -Dtest=none -DwildcardSuites=org.apache.spark.deploy.yarn.YarnClusterSuite -am" on a machine without a Python 3 environment aborts the whole suite instead of skipping the Python-dependent tests:
> {code:java}
> YarnClusterSuite:
> org.apache.spark.deploy.yarn.YarnClusterSuite *** ABORTED ***
>   java.lang.RuntimeException: Unable to load a Suite class that was discovered in the runpath: org.apache.spark.deploy.yarn.YarnClusterSuite
>   at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:81)
>   at org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
>   at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
>   at scala.collection.Iterator.foreach(Iterator.scala:943)
>   at scala.collection.Iterator.foreach$(Iterator.scala:943)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
>   at scala.collection.IterableLike.foreach(IterableLike.scala:74)
>   at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
>   at scala.collection.TraversableLike.map(TraversableLike.scala:286)
>   ...
> Run completed in 833 milliseconds.
> Total number of tests run: 0
> Suites: completed 1, aborted 1
> Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
> *** 1 SUITE ABORTED ***
> {code}
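The improvement presumably amounts to probing for the interpreter up front and canceling the Python-dependent tests, instead of letting the suite throw during ScalaTest discovery as in the trace above. A minimal sketch of that pattern (the suite name Python3AwareSuite, the probe helper, and the test body are hypothetical illustrations, not the actual SPARK-40022 patch):

{code:scala}
import scala.sys.process._
import scala.util.Try

import org.scalatest.funsuite.AnyFunSuite

class Python3AwareSuite extends AnyFunSuite {

  // Probe once for a working python3 binary. Running an external command
  // throws an IOException when the executable cannot be found, so wrap
  // the call in Try and treat any failure as "not available".
  private lazy val isPython3Available: Boolean =
    Try("python3 --version".! == 0).getOrElse(false)

  test("run a Python application") {
    // assume() throws TestCanceledException when the condition is false,
    // so the test is reported as canceled rather than aborting the suite.
    assume(isPython3Available, "python3 is not available on this machine")
    // ... submit the Python application here ...
  }
}
{code}

Because the probe runs inside the test body rather than in the suite's constructor, a machine without Python 3 reports the affected tests under the "canceled" count in the run summary instead of "*** 1 SUITE ABORTED ***", and the discovery-time RuntimeException above never fires.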