[ 
https://issues.apache.org/jira/browse/SPARK-13587?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15178536#comment-15178536
 ] 

Juliet Hougland edited comment on SPARK-13587 at 3/3/16 9:21 PM:
-----------------------------------------------------------------

If PySpark allows users to create virtual environments, users will also want 
and need other features of Python environment management on a cluster. I think 
this change would broaden the scope of PySpark to include Python package 
management on a cluster. I do not think that Spark should be in the business of 
creating Python environments. I think the support load in terms of feature 
requests, mailing list traffic, etc. would be very large. This feature would 
begin to solve one problem, but would also put us on the hook for many more. 

I agree with the general intention of this JIRA -- make it easier to manage and 
interact with complex python environments on a cluster. Perhaps there are other 
ways to accomplish this without broadening scope and functionality as much. For 
example, checking a requirements file against an environment before execution.
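To make the requirements-check idea concrete, here is a minimal sketch of what such a pre-execution check might look like. This is an assumption about the shape of the tool, not an existing Spark feature; it uses only the standard library (importlib.metadata, Python 3.8+) and handles simple requirement lines like "numpy" or "numpy==1.24.0".

```python
# Hypothetical sketch: verify a requirements file against the current
# interpreter's installed packages before launching a job, instead of
# having Spark build environments itself.
from importlib import metadata


def missing_requirements(path):
    """Return requirement lines not satisfied in this environment."""
    missing = []
    with open(path) as f:
        for line in f:
            req = line.strip()
            if not req or req.startswith("#"):
                continue  # skip blanks and comments
            name, _, version = req.partition("==")
            try:
                installed = metadata.version(name.strip())
            except metadata.PackageNotFoundError:
                missing.append(req)  # package not installed at all
                continue
            if version and installed != version.strip():
                missing.append(req)  # installed, but wrong version
    return missing
```

A driver could run this on each executor's interpreter at startup and fail fast with a clear message, which keeps environment creation itself outside of Spark.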

Edit: I see now that you are proposing a short-lived virtualenv. My objections 
about the broadening of scope still apply. I generally do not agree with 
suggestions that tightly tie us (and users) to a specific method of Python 
environment management. The loose coupling of Python envs on a cluster to 
PySpark (via a path to an interpreter) is a positive feature. I would much 
rather add --pyspark_python to the CLI tool (and deprecate the env var) than 
add a ton of logic to create environments for users. 
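For context, the loose coupling described above is how things work today: the PYSPARK_PYTHON environment variable points spark-submit at any interpreter, including one inside a pre-built virtualenv. The path below is illustrative, and the --pyspark_python flag is only a proposal here, not an existing option.

```shell
# Point PySpark workers at a pre-built virtualenv's interpreter.
# /opt/envs/myproject is a hypothetical path for illustration.
export PYSPARK_PYTHON=/opt/envs/myproject/bin/python
spark-submit --master yarn my_job.py
```

This keeps environment creation entirely in the user's hands; Spark only needs a working interpreter path on each node.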


was (Author: juliet):
If pyspark allows users to create virtual environments, users will also want 
and need other features of python environment management on a cluster. I think 
this change would broaden the scope of PySpark to include python package 
management on a cluster. I do not think that spark should be in the business of 
creating python environments. I think the support load in terms of feature 
requests, mailing list traffic, etc would be very large. This feature would 
begin to solve a problem, but would also put us on the hook for many more. 

I agree with the general intention of this JIRA -- make it easier to manage and 
interact with complex python environments on a cluster. Perhaps there are other 
ways to accomplish this without broadening scope and functionality as much. For 
example, checking a requirements file against an environment before execution.

> Support virtualenv in PySpark
> -----------------------------
>
>                 Key: SPARK-13587
>                 URL: https://issues.apache.org/jira/browse/SPARK-13587
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Jeff Zhang
>
> Currently, it's not easy for users to add third-party Python packages in 
> PySpark.
> * One way is to use --py-files (suitable for simple dependencies, but not 
> for complicated ones, especially those with transitive dependencies)
> * Another way is to install packages manually on each node (time-wasting, 
> and not easy when switching between different environments)
> Python now has two different virtualenv implementations: one is the native 
> virtualenv, the other is conda. This JIRA aims to bring these two tools to 
> the distributed environment.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
