[ https://issues.apache.org/jira/browse/SPARK-26404?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17459552#comment-17459552 ]

Tim Sanders commented on SPARK-26404:
-------------------------------------

I'm running into this as well. I'm on Spark 3.2.0 and not using Kubernetes.

 

Setting {{spark.pyspark.python}} via {{SparkSession.builder.config}} has no 
effect, but setting {{os.environ['PYSPARK_PYTHON']}} works as expected.
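
For reference, here's a minimal sketch of the two approaches I'm comparing from a plain Python process (the interpreter path is just a placeholder, borrowed from the original report):

{code:python}
import os
from pyspark.sql import SparkSession

# Passing the interpreter as a conf has no effect for me; the executors
# still launch whatever "python" is first on PATH:
#
#   spark = (SparkSession.builder
#            .config("spark.pyspark.python", "/opt/pythonenvs/bin/python")
#            .getOrCreate())
#
# Setting the environment variable *before* the session is created does work:
os.environ["PYSPARK_PYTHON"] = "/opt/pythonenvs/bin/python"
spark = SparkSession.builder.getOrCreate()
{code}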

Setting {{PYSPARK_PYTHON}} in {{spark-env.sh}} does _not_ seem to work with 
{{SparkSession}}, but it _does_ work with {{spark-submit}}.
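
For concreteness, the spark-env.sh line I'm referring to looks like this (same placeholder path):

{noformat}
# conf/spark-env.sh -- picked up by spark-submit, but apparently not by a
# SparkSession created from an already-running Python process
export PYSPARK_PYTHON=/opt/pythonenvs/bin/python
{noformat}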

Setting {{spark.pyspark.python}} in {{spark-defaults.conf}} does work for both 
{{spark-submit}} and {{SparkSession}}, but it doesn't help my use case: I need 
to pick the interpreter at runtime, based on the Python version the client is 
running, without maintaining multiple configs.
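
That configuration is just the usual one-liner (placeholder path again):

{noformat}
# conf/spark-defaults.conf -- honoured by both spark-submit and SparkSession,
# but it's a static, cluster-wide default rather than a per-client choice
spark.pyspark.python  /opt/pythonenvs/bin/python
{noformat}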

 

I agree with [~vpadulan]; I'm not sure why this was marked as "Not a Problem". 
Any chance we can get this re-opened?

> set spark.pyspark.python or PYSPARK_PYTHON doesn't work in k8s client-cluster 
> mode.
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-26404
>                 URL: https://issues.apache.org/jira/browse/SPARK-26404
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Spark Core
>    Affects Versions: 2.4.0
>            Reporter: Dongqing  Liu
>            Priority: Major
>
> Neither
>    conf.set("spark.executorEnv.PYSPARK_PYTHON", "/opt/pythonenvs/bin/python")
> nor 
>   conf.set("spark.pyspark.python", "/opt/pythonenvs/bin/python") 
> works. 
> Looks like the executor always picks python from PATH.
>  


