[ 
https://issues.apache.org/jira/browse/SPARK-13587?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15201763#comment-15201763
 ] 

Mike Sukmanowsky commented on SPARK-13587:
------------------------------------------

Sorry to bug [~juliet] - any thoughts? We're currently trying to come up with 
a workaround for Spark 1.6.0 on Amazon EMR that'd let us create a 
conda/virtual env on the YARN nodes before the application runs, but I don't 
think there's much we can really do.
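For reference, the kind of per-node workaround we've been considering would look 
roughly like the following bootstrap-style script. The env name and package list 
are placeholders, and it assumes conda is already installed on each YARN node - 
nothing here is provided by Spark or EMR out of the box:

```shell
#!/usr/bin/env bash
# Hypothetical per-node bootstrap script: create a throwaway conda env
# before the Spark application starts. Env name and packages are
# placeholders, not anything Spark/EMR ships.
set -euo pipefail

ENV_NAME="spark_app_env"   # temporary env name (placeholder)
conda create -y -n "$ENV_NAME" python=2.7 numpy pandas

# Point PySpark at the env's interpreter on this node.
export PYSPARK_PYTHON="$(conda info --base)/envs/$ENV_NAME/bin/python"
```

The catch is that there's no hook to run this (or a matching cleanup) on every 
node as part of the application lifecycle, which is what motivates the 
suggestion below.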

Building on my earlier suggestion, it'd probably also be helpful to add a 
--teardown option to spark-submit that'd run a script after the Spark 
application terminates. That way, conda/virtual envs with temporary names 
could be created and then destroyed.
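To make that concrete, usage might look something like this - note that neither 
--setup nor --teardown exists in spark-submit today; this only illustrates the 
proposal, and the script names are placeholders:

```shell
# Hypothetical invocation: --setup and --teardown are NOT existing
# spark-submit flags. create_env.sh would build the temporary
# conda/virtual env on each node before the app starts, and
# destroy_env.sh would remove it once the application exits.
spark-submit \
  --master yarn \
  --setup create_env.sh \
  --teardown destroy_env.sh \
  my_app.py
```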

> Support virtualenv in PySpark
> -----------------------------
>
>                 Key: SPARK-13587
>                 URL: https://issues.apache.org/jira/browse/SPARK-13587
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Jeff Zhang
>
> Currently, it's not easy for users to add third-party Python packages in 
> PySpark.
> * One way is to use --py-files (suitable for a simple dependency, but not 
> for a complicated one, especially with transitive dependencies)
> * Another way is to install packages manually on each node (time-consuming, 
> and not easy to switch between different environments)
> Python now has 2 different virtualenv implementations. One is the native 
> virtualenv; the other is through conda. This JIRA is trying to bring these 
> 2 tools to a distributed environment.
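For comparison, the existing --py-files route described above looks like this 
(the archive and script names are placeholders). --py-files is a real 
spark-submit flag, but every transitive dependency has to be bundled into the 
archive by hand, which is exactly the limitation the issue describes:

```shell
# Existing mechanism: ship dependencies as a zip/egg alongside the job.
# Fine for a simple, self-contained package; transitive dependencies
# must each be bundled into deps.zip manually.
spark-submit \
  --master yarn \
  --py-files deps.zip \
  my_app.py
```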



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
