[ https://issues.apache.org/jira/browse/SPARK-35687?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-35687.
----------------------------------
    Fix Version/s: 3.0.3
                   3.1.3
                   3.2.0
       Resolution: Fixed

Issue resolved by pull request 32833
[https://github.com/apache/spark/pull/32833]

> PythonUDFSuite move assume into its methods
> -------------------------------------------
>
>                 Key: SPARK-35687
>                 URL: https://issues.apache.org/jira/browse/SPARK-35687
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: XiDuo You
>            Assignee: XiDuo You
>            Priority: Minor
>             Fix For: 3.2.0, 3.1.3, 3.0.3
>
>
> When we run the Spark tests with a command such as:
> `./build/mvn -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn -Pkubernetes clean test`
> we get this exception:
> {code:java}
> PythonUDFSuite:
> org.apache.spark.sql.execution.python.PythonUDFSuite *** ABORTED ***
>   java.lang.RuntimeException: Unable to load a Suite class that was
>   discovered in the runpath:
>   org.apache.spark.sql.execution.python.PythonUDFSuite
>   at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:81)
>   at org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
>   at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
>   at scala.collection.Iterator.foreach(Iterator.scala:941)
>   at scala.collection.Iterator.foreach$(Iterator.scala:941)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
>   at scala.collection.IterableLike.foreach(IterableLike.scala:74)
>   at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
>   at scala.collection.TraversableLike.map(TraversableLike.scala:238)
> {code}
> The test environment does not have the PySpark module, so the suite fails to load.
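
The fix implied by the issue title is to move the assume() call out of the suite's class body and into the individual test methods, so ScalaTest can still instantiate the suite when PySpark is unavailable. Below is a minimal ScalaTest sketch of that pattern; PythonEnv, pythonAvailable, and the PYSPARK_PYTHON check are hypothetical stand-ins for the real helpers used by the Spark test code, not the actual implementation from the pull request.

{code:scala}
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical stand-in for Spark's check that a usable Python/PySpark
// environment exists on the machine running the tests.
object PythonEnv {
  val pythonAvailable: Boolean = sys.env.contains("PYSPARK_PYTHON")
}

class PythonUDFSuiteSketch extends AnyFunSuite {

  // Before the fix: an assume() in the class body runs while ScalaTest is
  // instantiating the suite, so a missing Python environment makes suite
  // discovery itself abort ("Unable to load a Suite class ...").
  // assume(PythonEnv.pythonAvailable)

  test("python udf works") {
    // After the fix: the assume() runs inside each test method, so the test
    // is merely cancelled when Python is unavailable and the suite still loads.
    assume(PythonEnv.pythonAvailable)
    // ... the real suite's Python UDF assertions would go here ...
  }
}
{code}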