[ https://issues.apache.org/jira/browse/SPARK-13587?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15178536#comment-15178536 ]
Juliet Hougland commented on SPARK-13587:
-----------------------------------------

If pyspark allows users to create virtual environments, users will also want and need other features of python environment management on a cluster. I think this change would broaden the scope of PySpark to include python package management on a cluster. I do not think that Spark should be in the business of creating python environments. The support load in terms of feature requests, mailing list traffic, etc. would be very large. This feature would begin to solve one problem, but it would also put us on the hook for many more.

I agree with the general intention of this JIRA -- make it easier to manage and interact with complex python environments on a cluster. Perhaps there are other ways to accomplish this without broadening the scope and functionality as much. For example, checking a requirements file against an environment before execution.

> Support virtualenv in PySpark
> -----------------------------
>
>                 Key: SPARK-13587
>                 URL: https://issues.apache.org/jira/browse/SPARK-13587
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Jeff Zhang
>
> Currently, it is not easy for users to add third-party python packages in
> pyspark.
> * One way is to use --py-files (suitable for simple dependencies, but not
> suitable for complicated dependencies, especially those with transitive dependencies)
> * Another way is to install packages manually on each node (time-consuming, and
> not easy when switching between different environments)
> Python now has 2 different virtualenv implementations: one is the native
> virtualenv, the other is through conda. This JIRA is about bringing these 2
> tools to the distributed environment.
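As a rough illustration of the requirements-file check suggested in the comment above, here is a minimal pre-flight sketch. It is only an assumption of how such a check could look (the file name requirements.txt and the use of setuptools' pkg_resources are illustrative choices, not anything Spark or this JIRA defines): it reads each requirement line and reports any package that is missing or at the wrong version, so a launcher script could fail fast with a clear message instead of hitting an ImportError on the executors at runtime.

{code:python}
# Hypothetical pre-flight check (not part of Spark): verify that the packages
# listed in a requirements.txt are installed at acceptable versions before
# submitting the PySpark job.
import sys

try:
    import pkg_resources  # ships with setuptools
except ImportError:
    sys.exit("setuptools (pkg_resources) is required for this check")


def check_requirements(path="requirements.txt"):
    """Return (requirement, error) pairs for entries that are not satisfied."""
    problems = []
    with open(path) as f:
        for line in f:
            req = line.strip()
            # skip blank lines and comments
            if not req or req.startswith("#"):
                continue
            try:
                pkg_resources.require(req)
            except (pkg_resources.DistributionNotFound,
                    pkg_resources.VersionConflict) as err:
                problems.append((req, str(err)))
    return problems


if __name__ == "__main__":
    issues = check_requirements()
    if issues:
        for req, err in issues:
            print("UNSATISFIED: %s (%s)" % (req, err))
        sys.exit(1)
    print("All requirements satisfied in this environment.")
{code}

Something like this could be run on the driver (or shipped to each node) before execution, which addresses the "validate the environment" part of the problem without making Spark responsible for creating or managing python environments.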