[ https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17291295#comment-17291295 ]
Dongjoon Hyun commented on SPARK-34507:
---------------------------------------

Hi, [~smarter]. When Apache Spark 2.4 published artifacts for both Scala 2.12 and 2.11, we used the same approach:
- [https://repo1.maven.org/maven2/org/apache/spark/spark-parent_2.12/2.4.7/spark-parent_2.12-2.4.7.pom]

As [~LuciferYang] mentioned, on the Spark side we always build with `-Pscala-2.13` and there is no problem. So the question is: on your application side, are you actually hitting issues when pulling the Scala 2.13 artifacts from Maven? If so, could you describe the symptom?

> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -------------------------------------------------------------------------
>
>                 Key: SPARK-34507
>                 URL: https://issues.apache.org/jira/browse/SPARK-34507
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Guillaume Martres
>            Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/]
> but they seem to depend on Scala 2.12. Specifically, if I look at
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
> I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>{code}
> It looks like
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
> needs to be updated to also change the `scala.version` and not just the
> `scala.binary.version`.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
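The mismatch reported above (`scala.version` at 2.12.10 while `scala.binary.version` is 2.13) can be checked mechanically from a POM's properties. As a rough sketch, assuming only that the two properties appear as quoted in the issue (the helper function and its name are hypothetical, not part of Spark's tooling):

```python
import re

def scala_versions_consistent(pom_text: str) -> bool:
    """Return True when <scala.version> begins with <scala.binary.version>,
    i.e. the full Scala version belongs to the declared binary series."""
    full = re.search(r"<scala\.version>([^<]+)</scala\.version>", pom_text)
    binary = re.search(r"<scala\.binary\.version>([^<]+)</scala\.binary\.version>", pom_text)
    if not full or not binary:
        return False
    return full.group(1).startswith(binary.group(1) + ".")

# The properties quoted from the snapshot POM fail the check:
bad = ("<scala.version>2.12.10</scala.version>"
       "<scala.binary.version>2.13</scala.binary.version>")
# A consistent pairing (2.13.5 is illustrative) passes:
good = ("<scala.version>2.13.5</scala.version>"
        "<scala.binary.version>2.13</scala.binary.version>")
print(scala_versions_consistent(bad))   # → False
print(scala_versions_consistent(good))  # → True
```

A check like this could be run against the POMs published for each binary version, which would have caught the case where `dev/change-scala-version.sh` rewrote only `scala.binary.version`.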