[ https://issues.apache.org/jira/browse/SPARK-2872?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14138996#comment-14138996 ]

Thomas Graves commented on SPARK-2872:
--------------------------------------

Adding the description from SPARK-3557, as it explains the issue better:

In YarnClientSchedulerBackend, we have:

// The environment variable is checked first, so it silently wins over the Spark config.
if (System.getenv(envVar) != null) {
  arrayBuf += (optionName, System.getenv(envVar))
} else if (sc.getConf.contains(sysProp)) {
  arrayBuf += (optionName, sc.getConf.get(sysProp))
}
Elsewhere in Spark we try to honor Spark configs over environment variables. 
This was introduced as a fix for the Yarn app name (SPARK-1631), but this also 
changed the behavior for other configs. Perhaps we should special case this 
particular config and correct the prioritization order of the other configs.
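A minimal sketch of what the corrected precedence could look like, reusing the optionName / envVar / sysProp / arrayBuf names from the snippet above. The special case for the app name (kept env-var-first per SPARK-1631, via SPARK_YARN_APP_NAME / spark.app.name) is an assumption about the shape of the fix, not the actual patch:

// Sketch: honor the Spark config over the environment variable for all
// options, except the app name, which SPARK-1631 deliberately sources
// from the environment first. (Hypothetical special-casing, for illustration.)
val isAppName = sysProp == "spark.app.name"
val value =
  if (isAppName && System.getenv(envVar) != null) {
    Some(System.getenv(envVar))
  } else if (sc.getConf.contains(sysProp)) {
    Some(sc.getConf.get(sysProp))
  } else {
    Option(System.getenv(envVar))
  }
value.foreach { v => arrayBuf += (optionName, v) }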

> Fix conflict between code and doc in YarnClientSchedulerBackend
> ---------------------------------------------------------------
>
>                 Key: SPARK-2872
>                 URL: https://issues.apache.org/jira/browse/SPARK-2872
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.0.0
>            Reporter: Zhihui
>
> The doc says that system properties override environment variables:
> https://github.com/apache/spark/blob/master/yarn/common/src/main/scala/org/apache/spark/scheduler/cluster/YarnClientSchedulerBackend.scala#L71
> But the code conflicts with it.



