Hi,

Just found this paragraph in
http://spark.apache.org/docs/2.4.6/index.html#downloading:

"Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API,
Spark 2.4.6 uses Scala 2.12. You will need to use a compatible Scala
version (2.12.x)."

That seems to contradict the Scala version in the pom.xml [1], which is
2.11.12. The docs say that Spark 2.4.6 uses Scala 2.12 by default, which
looks incorrect to me. Am I missing something?

My question is: what's the official Scala version of Spark 2.4.6 (and of
the others in the 2.4.x release line)?

(I do know that Spark 2.4.x can be compiled with Scala 2.12, but that
requires the scala-2.12 profile [2] to be enabled.)
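
For what it's worth, one quick way to confirm which Scala version a given
Spark binary was actually built against is to check the scala-library jar
on its classpath. A minimal sketch (the ScalaVersionCheck name is mine,
just for illustration):

import scala.util.Properties

object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Reports the version of the scala-library jar on the classpath,
    // i.e. the Scala version this Spark build ships with
    // (e.g. "2.11.12" for a default Spark 2.4.6 build)
    println(s"Scala ${Properties.versionNumberString}")
  }
}

Pasting just the println line into spark-shell works too, and if I
remember correctly the spark-shell startup banner itself prints "Using
Scala version ...".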

[1] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L158
[2] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L2830

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski
