[ 
https://issues.apache.org/jira/browse/SPARK-34190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Haejoon Lee updated SPARK-34190:
--------------------------------
    Description: 
The "Using Virtualenv" chapter lacks an explanation of an important limitation.

It says "It packs the current virtual environment to an archive file, and It 
self-contains both Python interpreter and the dependencies", but this does not 
work if Python is not installed on all nodes in the cluster.

This is because the Python binary in the packed environment is a symbolic link 
to the local Python, so Python must exist at the same path on every node.
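The symlink behavior described above can be checked directly. A minimal sketch 
(the venv-pack/spark-submit lines are illustrative and commented out; the 
archive name, alias, and app.py are example values, not from the docs):

```shell
# On Linux/macOS, the interpreter inside a venv is typically a symlink
# back to the Python that created it.
python3 -m venv demo_venv
ls -l demo_venv/bin/python   # e.g. "python -> /usr/bin/python3"

# Packing this environment (requires `pip install venv-pack`) preserves
# that symlink, so /usr/bin/python3 must also exist on every executor node:
# venv-pack -o pyspark_venv.tar.gz
# export PYSPARK_PYTHON=./environment/bin/python
# spark-submit --archives pyspark_venv.tar.gz#environment app.py
```

If the target of that symlink is missing on a worker node, the unpacked 
environment cannot start a Python process there.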

 

  was:
The "Using Virtualenv" chapter lacks an explanation of an important limitation.

It says "It packs the current virtual environment to an archive file, and It 
self-contains both Python interpreter and the dependencies", but this does not 
work if Python is not installed on all nodes in the cluster.

The Python binary in the packed environment is a symbolic link to the local 
Python, so Python must exist at the same path on every node.

 


> Supplement the description in the document
> ------------------------------------------
>
>                 Key: SPARK-34190
>                 URL: https://issues.apache.org/jira/browse/SPARK-34190
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 3.0.1
>            Reporter: Haejoon Lee
>            Priority: Major
>
> The "Using Virtualenv" chapter lacks an explanation of an important 
> limitation.
> It says "It packs the current virtual environment to an archive file, and It 
> self-contains both Python interpreter and the dependencies", but this does 
> not work if Python is not installed on all nodes in the cluster.
> This is because the Python binary in the packed environment is a symbolic 
> link to the local Python, so Python must exist at the same path on every 
> node.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
