[ https://issues.apache.org/jira/browse/SPARK-36088?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17461230#comment-17461230 ]

jingxiong zhong commented on SPARK-36088:
-----------------------------------------

In cluster mode I have another question: when python3.6.6.zip is unzipped in the 
pod, the extracted Python binary has no execute permission. My submit command is as follows:

{code:shell}
spark-submit \
--archives ./python3.6.6.zip#python3.6.6 \
--conf "spark.pyspark.python=python3.6.6/python3.6.6/bin/python3" \
--conf "spark.pyspark.driver.python=python3.6.6/python3.6.6/bin/python3" \
--conf spark.kubernetes.container.image.pullPolicy=Always \
./examples/src/main/python/pi.py 100
{code}
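
One possible workaround, assuming the missing execute bit comes from zip extraction dropping unix permission bits: pack the environment as a tar.gz instead (for example with conda-pack), since tar archives keep file modes when unpacked. This is only a sketch; the archive and environment names below are illustrative, not taken from the setup above.

{code:shell}
# Sketch only: build a relocatable env and pack it as a tar.gz so file modes
# (including the execute bit on bin/python) survive unpacking.
conda create -y -n pyspark_env python=3.6
conda activate pyspark_env
conda pack -f -o pyspark_env.tar.gz

# Ship the tarball instead of the zip (archive/alias names are illustrative).
spark-submit \
  --archives pyspark_env.tar.gz#environment \
  --conf "spark.pyspark.python=environment/bin/python" \
  --conf "spark.pyspark.driver.python=environment/bin/python" \
  --conf spark.kubernetes.container.image.pullPolicy=Always \
  ./examples/src/main/python/pi.py 100
{code}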


> 'spark.archives' does not extract the archive file into the driver under 
> client mode
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-36088
>                 URL: https://issues.apache.org/jira/browse/SPARK-36088
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes, Spark Submit
>    Affects Versions: 3.1.2
>            Reporter: rickcheng
>            Priority: Major
>
> When running Spark in a K8s cluster, there are 2 deploy modes: cluster and 
> client. In my tests, in cluster mode *spark.archives* extracts the archive 
> file into the working directory of both the executors and the driver. But in 
> client mode, *spark.archives* only extracts the archive file into the working 
> directory of the executors.
>  
> However, I need *spark.archives* to send the virtual environment tar file 
> packaged by conda to both the driver and the executors under client mode (so 
> that the executors and the driver have the same Python environment).
>  
> Why does *spark.archives* not extract the archive file into the working 
> directory of the driver under client mode?
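
For the client-mode use case described above, one sketch that follows the PySpark packaging guide's conda-pack flow: activate the same conda environment locally for the driver (which runs where spark-submit runs) and let the executors use the copy shipped via *spark.archives*. The archive name and alias below are illustrative.

{code:shell}
# Sketch only (names are illustrative): client mode with a conda-packed env.
# The driver runs on the submitting host, so point it at the locally activated env;
# executors use the interpreter from the archive unpacked by spark.archives.
conda activate pyspark_env
export PYSPARK_DRIVER_PYTHON=python             # driver: interpreter of the activated local env
export PYSPARK_PYTHON=./environment/bin/python  # executors: interpreter from the unpacked archive

spark-submit \
  --deploy-mode client \
  --archives pyspark_env.tar.gz#environment \
  ./examples/src/main/python/pi.py 100
{code}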



