[ https://issues.apache.org/jira/browse/SPARK-5091?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15466855#comment-15466855 ]
Semet commented on SPARK-5091:
------------------------------

A better option is to use virtualenv and a proper installation with pip, which is more scalable for Python jobs. Manipulating PYTHONPATH can lead to a lot of strange behavior. See [#14180|https://github.com/apache/spark/pull/14180].

> Hooks for PySpark tasks
> -----------------------
>
>                 Key: SPARK-5091
>                 URL: https://issues.apache.org/jira/browse/SPARK-5091
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>            Reporter: Davies Liu
>
> Currently, it is not convenient to add a package to PYTHONPATH on an executor (we do not assume the environments of the driver and the executors are identical).
> It would be nice to have a hook called before/after every task; users could then manipulate sys.path via pre-task hooks.