Is the deprecation of JDK 7 and Scala 2.10 documented anywhere outside the
release notes for Spark 2.0.0? I do not consider release notes to be
sufficient public notice for deprecation of supported platforms - this
should be noted in the documentation somewhere. Here are the only
mentions I could find:

At http://spark.apache.org/downloads.html it says:

"*Note: Starting version 2.0, Spark is built with Scala 2.11 by default.
Scala 2.10 users should download the Spark source package and build with
Scala 2.10 support
<http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-210>."*
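For what it's worth, the practical effect of that default on downstream
builds is the Scala binary suffix on the published artifacts. A minimal
sbt sketch (assuming an sbt-based project and the spark-core artifact,
purely for illustration):

    // build.sbt (sketch): the Scala version chosen here has to match the
    // Scala binary version of the Spark build being depended on.
    scalaVersion := "2.11.8"  // or e.g. "2.10.6" against a Spark build
                              // made with Scala 2.10 support

    // %% appends the Scala binary suffix, so this resolves to
    // spark-core_2.11 (or spark-core_2.10) depending on scalaVersion.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1" % "provided"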

At http://spark.apache.org/docs/latest/#downloading it says:

"Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API,
Spark 2.0.1 uses Scala 2.11. You will need to use a compatible Scala
version (2.11.x)."

At
http://spark.apache.org/docs/latest/programming-guide.html#linking-with-spark
it says:

   - "Spark 2.0.1 is built and distributed to work with Scala 2.11 by
   default. (Spark can be built to work with other versions of Scala, too.) To
   write applications in Scala, you will need to use a compatible Scala
   version (e.g. 2.11.X)."
   - "Spark 2.0.1 works with Java 7 and higher. If you are using Java 8,
   Spark supports lambda expressions
   <http://docs.oracle.com/javase/tutorial/java/javaOO/lambdaexpressions.html>
   for concisely writing functions, otherwise you can use the classes in the
   org.apache.spark.api.java.function
   
<http://spark.apache.org/docs/latest/api/java/index.html?org/apache/spark/api/java/function/package-summary.html>
   package."
   - "Spark 2.0.1 works with Python 2.6+ or Python 3.4+. It can use the
   standard CPython interpreter, so C libraries like NumPy can be used. It
   also works with PyPy 2.3+."
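
To make the linking requirement in the first bullet concrete, a minimal
Scala application built against those versions might look like the sketch
below (the object name and input handling are made up for illustration;
only the versions come from the quoted guide):

    // Sketch of an application compiled with Scala 2.11.x against Spark 2.0.1.
    import org.apache.spark.{SparkConf, SparkContext}

    object SimpleApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("SimpleApp")
        val sc = new SparkContext(conf)
        // Word count over a text file, just to exercise the Scala API.
        val counts = sc.textFile(args(0))
          .flatMap(_.split("\\s+"))
          .map(word => (word, 1))
          .reduceByKey(_ + _)
        counts.take(10).foreach(println)
        sc.stop()
      }
    }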
