[ https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17298970#comment-17298970 ]

Guillaume Martres commented on SPARK-34507:
-------------------------------------------

[~srowen] This issue is about the artifacts published in 
https://repository.apache.org/content/repositories/snapshots/org/apache/spark/, see 
http://apache-spark-developers-list.1001551.n3.nabble.com/FYI-Scala-2-13-Maven-Artifacts-td30616.html

These appear to be published after running the script that modifies the POMs, 
but my contention is that this script is incomplete and should also update 
`scala.version`, not just `scala.binary.version`.
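
For concreteness, something along these lines in dev/change-scala-version.sh would keep the two properties in sync. This is only a rough sketch, not the script's actual code: the loop, the sed pattern, and the SCALA_FULL_VERSION placeholder (e.g. 2.13.5, or whatever the scala-2.13 profile in the parent POM declares) are all hypothetical.

{code:bash}
# Hypothetical addition to dev/change-scala-version.sh: in addition to
# rewriting <scala.binary.version>, also rewrite <scala.version> in the POMs.
SCALA_FULL_VERSION="2.13.5"   # assumed placeholder; should match the scala-2.13 profile

find . -name pom.xml -not -path '*target*' -print0 |
while IFS= read -r -d '' POM; do
  sed -e "s|<scala\.version>[0-9][0-9.]*</scala\.version>|<scala.version>${SCALA_FULL_VERSION}</scala.version>|" \
    "$POM" > "$POM.tmp" && mv "$POM.tmp" "$POM"
done
{code}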

> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -------------------------------------------------------------------------
>
>                 Key: SPARK-34507
>                 URL: https://issues.apache.org/jira/browse/SPARK-34507
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Guillaume Martres
>            Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at 
> https://repository.apache.org/content/repositories/snapshots/org/apache/spark/, 
> but they seem to depend on Scala 2.12. Specifically, if I look at 
> https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom 
>  I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>
> {code}
> It looks like 
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
>  needs to be updated to also change the `scala.version` and not just the 
> `scala.binary.version`.


