[ 
https://issues.apache.org/jira/browse/SPARK-23635?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Saisai Shao updated SPARK-23635:
--------------------------------
    Description: 
In the current Spark on YARN code, the AM always copies its env variables to 
the executors, overwriting any values already set there, so we cannot set 
different values for executors.

To reproduce the issue, start spark-shell like:
{code:java}
./bin/spark-shell --master yarn-client \
  --conf spark.executorEnv.SPARK_ABC=executor_val \
  --conf spark.yarn.appMasterEnv.SPARK_ABC=am_val
{code}
Then check the executor env variables with:
{code:java}
sc.parallelize(1 to 1).flatMap { i => sys.env.toSeq }.collect.foreach(println)
{code}
You will always get {{am_val}} instead of {{executor_val}}, so the AM should 
not overwrite explicitly set executor env variables.
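The intended precedence can be sketched in plain Scala. This is a minimal illustration of the merge order the fix should enforce, not Spark's actual implementation; the names {{amEnv}}, {{executorEnv}}, and {{merged}} are hypothetical.

```scala
object EnvPrecedence {
  // Env variables the AM propagates to every executor (illustrative values)
  val amEnv: Map[String, String] = Map("SPARK_ABC" -> "am_val")

  // Env variables explicitly set via spark.executorEnv.* (illustrative values)
  val executorEnv: Map[String, String] = Map("SPARK_ABC" -> "executor_val")

  // Desired merge: executor-specific entries must win on key collisions.
  // Scala's `++` keeps the right-hand map's value for duplicate keys,
  // so putting executorEnv on the right gives it precedence.
  val merged: Map[String, String] = amEnv ++ executorEnv
}
```

With this ordering, {{SPARK_ABC}} resolves to {{executor_val}} on executors; the bug is equivalent to performing the merge in the opposite order.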

  was:
In the current Spark on YARN code, AM always will copy and overwrite its env 
variables to executors, so we cannot set different values of executors.

To reproduce issue, user could start spark-shell like:

{code}

./bin/spark-shell --master yarn-client --conf 
spark.executorEnv.SPARK_ABC=executor_val --conf  
spark.yarn.appMasterEnv.SPARK_ABC=am_val

{code}

Then check executor env variables by

{code}

sc.parallelize(1 to 1).flatMap { i => sys.env.toSeq }.collect.foreach(println)

{code}

You will always get {{am_val}} instead of {{executor_val}}. So we should not 
let AM to overwrite specifically set executor env variables.


> Spark executor env variable is overwritten by same name AM env variable
> -----------------------------------------------------------------------
>
>                 Key: SPARK-23635
>                 URL: https://issues.apache.org/jira/browse/SPARK-23635
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.3.0
>            Reporter: Saisai Shao
>            Priority: Minor
>
> In the current Spark on YARN code, the AM always copies its env variables to 
> the executors, overwriting any values already set there, so we cannot set 
> different values for executors.
> To reproduce the issue, start spark-shell like:
> {code:java}
> ./bin/spark-shell --master yarn-client \
>   --conf spark.executorEnv.SPARK_ABC=executor_val \
>   --conf spark.yarn.appMasterEnv.SPARK_ABC=am_val
> {code}
> Then check the executor env variables with:
> {code:java}
> sc.parallelize(1 to 1).flatMap { i => sys.env.toSeq }.collect.foreach(println)
> {code}
> You will always get {{am_val}} instead of {{executor_val}}, so the AM should 
> not overwrite explicitly set executor env variables.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
