I've helped build conda-installable Spark packages in the past. You can
find an older recipe here:
https://github.com/conda/conda-recipes/tree/master/spark

And I've been updating packages here: 
https://anaconda.org/anaconda-cluster/spark

`conda install -c anaconda-cluster spark` 

The above should work on OS X and 64-bit Linux, for Python 2.7 and 3.4.

--Ben 




--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/PySpark-on-PyPi-tp12626p13659.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
