[
https://issues.apache.org/jira/browse/SPARK-15238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-15238.
-------------------------------
Resolution: Fixed
Fix Version/s: 2.0.0
Issue resolved by pull request 13017
[https://github.com/apache/spark/pull/13017]
> Clarify Python 3 support in docs
> --------------------------------
>
> Key: SPARK-15238
> URL: https://issues.apache.org/jira/browse/SPARK-15238
> Project: Spark
> Issue Type: Improvement
> Components: Documentation, PySpark
> Reporter: Nicholas Chammas
> Priority: Trivial
> Fix For: 2.0.0
>
>
> The [current doc|http://spark.apache.org/docs/1.6.1/] reads:
> {quote}
> Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.1
> uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
> {quote}
> Projects that support Python 3 generally mention that explicitly. A casual
> Python user might assume from this line that Spark supports Python 2.6 and
> 2.7 but not 3+.
> More specifically, I gather from SPARK-4897 that Spark actually supports
> Python 3.4+, not earlier versions of Python 3.
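The support matrix described above (Python 2.6+ on the 2.x line, Python 3.4+ on the 3.x line) can be expressed as a small runtime guard. This is only an illustrative sketch of such a check, not PySpark's actual startup code; the function name `check_python_version` is hypothetical.

```python
import sys

def check_python_version(version_info):
    """Return True if the interpreter version satisfies the support
    matrix discussed in this issue: Python 2.6+ on the 2.x line, or
    Python 3.4+ on the 3.x line. Illustrative only; not PySpark's
    real version check."""
    if version_info[0] == 2:
        return version_info[:2] >= (2, 6)
    return version_info[:2] >= (3, 4)

# A launcher could call this with sys.version_info and abort early
# with a clear message instead of failing later in obscure ways.
if not check_python_version(sys.version_info):
    raise RuntimeError("unsupported Python version: %s" % sys.version)
```

Documenting the cutoff explicitly (as the issue requests) and enforcing it at startup keeps the docs and the runtime behavior in agreement.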
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)