Hi, I would prefer that PySpark could also be run on Python 3.
Do you have a specific reason or requirement for using PySpark with Python 3? If you create an issue on JIRA, I will try to resolve it.

On 4 October 2014 06:47, Gen <gen.tan...@gmail.com> wrote:
> According to the official site of Spark, the latest version of
> Spark (1.1.0) does not work with Python 3:
>
> "Spark 1.1.0 works with Python 2.6 or higher (but not Python 3). It uses the
> standard CPython interpreter, so C libraries like NumPy can be used."
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/pyspark-on-python-3-tp15706p15707.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

--
class Cocoatomo:
    name = 'cocoatomo'
    email_address = 'cocoatom...@gmail.com'
    twitter_id = '@cocoatomo'
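As an aside, the quoted constraint ("Python 2.6 or higher, but not Python 3") can be made explicit with a small interpreter-version guard before importing pyspark. This is a hypothetical sketch, not code from Spark itself:

```python
import sys

def supported_python_version(version_info=sys.version_info):
    """Return True if this interpreter satisfies PySpark 1.1.0's
    documented requirement: CPython 2.6+, but not Python 3.

    Hypothetical helper for illustration; not part of Spark's source.
    """
    major, minor = version_info[0], version_info[1]
    return major == 2 and minor >= 6

# Example guard one might place at the top of a driver script:
# if not supported_python_version():
#     raise RuntimeError("PySpark 1.1.0 requires Python 2.6+ (not Python 3)")
```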