Github user holdenk commented on the issue:

    https://github.com/apache/spark/pull/15659
  
    I think we should try to get this in so that it can be part of the 2.1 
release, which would let us add publishing to PyPI in 2.1.1 or 2.2. We've been 
looking at making PySpark pip-installable since April of 2014 in one form or 
another, and from the discussions I think it's pretty clear this could make a 
big difference in Spark adoption in the Python community.
    
    The most obvious non-additive change is how the scripts resolve 
SPARK_HOME, but I think if we're extra careful around that, it would be OK to 
merge to 2.1 after the initial branch is cut.
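    To illustrate the SPARK_HOME concern, the resolution would roughly need to 
prefer an explicitly set environment variable and otherwise fall back to the 
location of a pip-installed pyspark package. This is a hypothetical sketch of 
that logic (the function name and fallback strategy are illustrative, not the 
PR's actual code):

```python
import importlib.util
import os


def find_spark_home():
    """Resolve a Spark home directory (illustrative sketch).

    Prefer the SPARK_HOME environment variable; otherwise fall back
    to the install location of the pyspark package, if present.
    """
    # An explicitly set SPARK_HOME always wins, preserving the
    # behavior of a traditional Spark distribution.
    if "SPARK_HOME" in os.environ:
        return os.environ["SPARK_HOME"]

    # Fall back to wherever pip installed the pyspark package.
    spec = importlib.util.find_spec("pyspark")
    if spec is not None and spec.submodule_search_locations:
        return list(spec.submodule_search_locations)[0]

    raise RuntimeError("Could not find a valid SPARK_HOME")
```

    The non-additive part is that launcher scripts which previously assumed a 
fixed directory layout would need to go through a lookup like this instead.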
    
    That being said, as the author of the most recent iteration of this, I of 
course have my own biases at play.

