[ 
https://issues.apache.org/jira/browse/SPARK-34746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean R. Owen resolved SPARK-34746.
----------------------------------
    Resolution: Duplicate

Yep, though there are reasons we couldn't use 2.12.12

> Spark dependencies require scala 2.12.12
> ----------------------------------------
>
>                 Key: SPARK-34746
>                 URL: https://issues.apache.org/jira/browse/SPARK-34746
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.1
>            Reporter: Peter Kaiser
>            Priority: Critical
>
> In our application we're creating a Spark session programmatically. The 
> application is built using Gradle.
> After upgrading Spark to 3.1.1 it no longer works, due to incompatible 
> classes on the driver and executor (namely 
> scala.collection.mutable.WrappedArray$ofRef).
> It turns out this was caused by different Scala versions on the driver vs. 
> the executor. While Spark still ships with Scala 2.12.10, some of its 
> dependencies in the Gradle build require Scala 2.12.12:
> {noformat}
> Cannot find a version of 'org.scala-lang:scala-library' that satisfies the 
> version constraints:
> Dependency path '...' --> '...' --> 'org.scala-lang:scala-library:{strictly 
> 2.12.10}'
> Dependency path '...' --> 'org.apache.spark:spark-core_2.12:3.1.1' --> 
> 'org.json4s:json4s-jackson_2.12:3.7.0-M5' --> 
> 'org.scala-lang:scala-library:2.12.12' {noformat}
>  
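
A common Gradle-side workaround for this kind of conflict is to pin the Scala
artifacts to the version the Spark cluster actually ships with. A minimal
sketch, assuming the Gradle Kotlin DSL and that the executors run Spark's
bundled Scala 2.12.10:

```kotlin
// build.gradle.kts — force all configurations to resolve the Scala
// runtime artifacts to 2.12.10, overriding transitive constraints
// (e.g. json4s pulling in scala-library 2.12.12).
configurations.all {
    resolutionStrategy {
        force("org.scala-lang:scala-library:2.12.10")
        force("org.scala-lang:scala-reflect:2.12.10")
    }
}
```

Pinning both scala-library and scala-reflect keeps the driver classpath
consistent with the executors; the alternative is running a Spark build whose
bundled Scala matches what the application resolves.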



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
