GitHub user rgbkrk commented on the issue:

    https://github.com/apache/spark/pull/8318
  
    > How will it work if users want to run a different version of PySpark from a different version of Spark (maybe something they installed locally)? How can they easily swap that out? We don't want this making it harder to use Spark against a real cluster because the version you got from pip is wrong.
    
    They have to deal with normal Python packaging semantics. Right now, _not_ making it pip-installable and importable actually makes things harder for us: we rely on [findspark](https://github.com/minrk/findspark) to resolve the package, plus some amount of ritual to start the JVM (sketched below). In case you're wondering, yes, I use Spark against a real, live, large cluster, and so do the users I support.
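    
    For reference, a minimal sketch of that findspark ritual, assuming findspark is installed and `SPARK_HOME` points at a local Spark distribution:
    
    ```python
    import findspark
    
    # Locate the local Spark distribution (via SPARK_HOME or common
    # install locations) and put pyspark/py4j on sys.path so the
    # import below works.
    findspark.init()
    
    import pyspark
    
    sc = pyspark.SparkContext(appName="findspark-demo")
    print(sc.parallelize(range(10)).sum())  # 45
    sc.stop()
    ```
    
    With a pip-installable pyspark, the findspark step disappears and a plain `import pyspark` just works.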
    
    > Can we make an account that's shared by all the committers somehow?
    
    You can. However, it's easier on PyPI to grant access rights to each individual committer.
    
    > Can we sign releases?
    
    Yes, you can GPG-sign them.
    
    > In particular, does anyone have examples of other ASF projects that publish to PyPI?
    
    [Apache Libcloud](https://libcloud.apache.org/) publishes releases to PyPI as `apache-libcloud`.
    


