[ https://issues.apache.org/jira/browse/SPARK-35838?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17366418#comment-17366418 ]

Apache Spark commented on SPARK-35838:
--------------------------------------

User 'LuciferYang' has created a pull request for this issue:
https://github.com/apache/spark/pull/32994

> Ensure kafka-0-10-sql module can maven test independently in Scala 2.13
> -----------------------------------------------------------------------
>
>                 Key: SPARK-35838
>                 URL: https://issues.apache.org/jira/browse/SPARK-35838
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Yang Jie
>            Priority: Minor
>
>  
> Execute the following Maven command to test the module on its own:
>  
> {code:bash}
> mvn clean install -Phadoop-3.2 -Phive-2.3 -Phadoop-cloud -Pmesos -Pyarn \
>   -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive \
>   -Pscala-2.13 -pl external/kafka-0-10-sql
> {code}
>  
> One Scala test run aborts; the error message is:
> {code:java}
> Discovery starting.
> Discovery completed in 857 milliseconds.
> Run starting. Expected test count is: 464
> ...
> KafkaRelationSuiteV2:
> - explicit earliest to latest offsets
> - default starting and ending offsets
> - explicit offsets
> - default starting and ending offsets with headers
> - timestamp provided for starting and ending
> - timestamp provided for starting, offset provided for ending
> - timestamp provided for ending, offset provided for starting
> - timestamp provided for starting, ending not provided
> - timestamp provided for ending, starting not provided
> - global timestamp provided for starting and ending
> - no matched offset for timestamp - startingOffsets
> - preferences on offset related options
> - no matched offset for timestamp - endingOffsets
> *** RUN ABORTED ***
>   java.lang.NoClassDefFoundError: scala/collection/parallel/TaskSupport
>   at org.apache.spark.SparkContext.$anonfun$union$1(SparkContext.scala:1411)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.SparkContext.withScope(SparkContext.scala:788)
>   at org.apache.spark.SparkContext.union(SparkContext.scala:1405)
>   at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:697)
>   at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:182)
>   at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:220)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:217)
>   ...
>   Cause: java.lang.ClassNotFoundException: scala.collection.parallel.TaskSupport
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>   at org.apache.spark.SparkContext.$anonfun$union$1(SparkContext.scala:1411)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.SparkContext.withScope(SparkContext.scala:788)
>   at org.apache.spark.SparkContext.union(SparkContext.scala:1405)
>   at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:697)
>   ...
> {code}
>  
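The missing class, scala.collection.parallel.TaskSupport, was split out of the Scala standard library into the separate scala-parallel-collections module in Scala 2.13, so a standalone build of kafka-0-10-sql can fail to get it onto the test classpath. A minimal sketch of the kind of POM change that would address this — the exact placement and coordinates are assumptions, not confirmed from the linked PR:

{code:xml}
<!-- external/kafka-0-10-sql/pom.xml — sketch only; placement is an assumption -->
<profile>
  <id>scala-2.13</id>
  <dependencies>
    <!-- scala.collection.parallel moved to this module in Scala 2.13 -->
    <dependency>
      <groupId>org.scala-lang.modules</groupId>
      <artifactId>scala-parallel-collections_${scala.binary.version}</artifactId>
      <!-- version assumed to be managed by the parent POM -->
      <scope>test</scope>
    </dependency>
  </dependencies>
</profile>
{code}

Scoping the dependency to the scala-2.13 profile keeps the 2.12 build unchanged, since parallel collections still ship with the standard library there.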



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
