Github user yaooqinn commented on the issue:

    https://github.com/apache/spark/pull/19840
  
    I can see `spark.executorEnv.PYSPARK_PYTHON` in the `sparkConf` on the executor side,
because it is set at
[context.py#L156](https://github.com/yaooqinn/spark/blob/8ff5663fe9a32eae79c8ee6bc310409170a8da64/python/pyspark/context.py#L156)
via
[conf.py#L153](https://github.com/yaooqinn/spark/blob/8ff5663fe9a32eae79c8ee6bc310409170a8da64/python/pyspark/conf.py#L153).
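    For reference, a minimal sketch (not from this PR) of why the key shows up in the conf: `SparkConf.setExecutorEnv` stores the variable under the `spark.executorEnv.` prefix, so it is readable later like any other conf entry. The Python path below is just an illustrative value.

```python
from pyspark import SparkConf

# setExecutorEnv stores the variable under the "spark.executorEnv." prefix,
# which is why spark.executorEnv.PYSPARK_PYTHON is visible in the conf.
conf = SparkConf().setExecutorEnv("PYSPARK_PYTHON", "/usr/bin/python3")

assert conf.get("spark.executorEnv.PYSPARK_PYTHON") == "/usr/bin/python3"
```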

