[ https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17298889#comment-17298889 ]
Vincenzo commented on SPARK-34507:
----------------------------------

Hello! [~smarter] has been helping me with a Spark 3.2 project using Scala 2.13 and Scala 3. I put together a small project showing the issue; you can find it at [https://github.com/vincenzobaz/spark-scala3]. In particular, you can see the failure [here|https://github.com/vincenzobaz/spark-scala3/runs/2077805484?check_suite_focus=true#step:5:61], and the wrong reflect library being pulled in [here|https://github.com/vincenzobaz/spark-scala3/runs/2077805484?check_suite_focus=true#step:4:937].

> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -------------------------------------------------------------------------
>
>                 Key: SPARK-34507
>                 URL: https://issues.apache.org/jira/browse/SPARK-34507
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Guillaume Martres
>            Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/],
> but they seem to depend on Scala 2.12. Specifically, if I look at
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
> I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>{code}
> It looks like
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
> needs to be updated to also change the `scala.version`, not just the
> `scala.binary.version`.
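The shape of the fix described above can be sketched as follows. This is a hypothetical fragment, not the actual contents of `dev/change-scala-version.sh`: the variable names, the `2.13.5` full version, and the sample POM fragment are all illustrative assumptions. It shows the existing substitution on `scala.binary.version` plus the proposed additional substitution on the full `scala.version`:

```shell
#!/bin/sh
set -e

# Illustrative values; the real script derives these from its arguments.
FROM_BINARY="2.12"
TO_BINARY="2.13"
TO_FULL="2.13.5"   # assumed full version for the target binary version

# Minimal stand-in for a Spark POM (matches the properties quoted in the issue).
cat > pom-fragment.xml <<'EOF'
<scala.version>2.12.10</scala.version>
<scala.binary.version>2.12</scala.binary.version>
EOF

# Existing behavior: flip only the binary version.
sed -i.bak \
  "s|<scala.binary.version>${FROM_BINARY}</scala.binary.version>|<scala.binary.version>${TO_BINARY}</scala.binary.version>|" \
  pom-fragment.xml

# Proposed addition: also rewrite the full scala.version, so the POM stops
# pointing at a 2.12.x scala-reflect/scala-library artifact.
sed -i.bak \
  "s|<scala.version>[0-9.]*</scala.version>|<scala.version>${TO_FULL}</scala.version>|" \
  pom-fragment.xml

cat pom-fragment.xml
```

With both substitutions applied, the fragment ends up with `<scala.version>2.13.5</scala.version>` and `<scala.binary.version>2.13</scala.binary.version>`, which is the consistent state the report asks for.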