[ https://issues.apache.org/jira/browse/SPARK-6469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Christophe PRÉAUD updated SPARK-6469:
-------------------------------------
    Attachment: TestYarnVars.scala

Attached a simple application to check the value of the {{CONTAINER_ID}} environment variable (a minimal sketch of such an application is given at the bottom of this message).

* Check in yarn-cluster mode:
{code}
/opt/spark/bin/spark-submit --master yarn-cluster --class TestYarnVars --queue spark-batch testyarnvars_2.10-1.0.jar 2>/dev/null
{code}
(the stdout of the application on the YARN web UI reads: {{CONTAINER_ID: container_1426666761810_0151_01_000001}})
* Check in yarn-client mode:
{code}
/opt/spark/bin/spark-submit --master yarn-client --class TestYarnVars --queue spark-batch testyarnvars_2.10-1.0.jar 2>/dev/null
{code}
(the stdout reads: {{CONTAINER_ID: null}})


> Local directories configured for YARN are not used in yarn-client mode
> ----------------------------------------------------------------------
>
>                 Key: SPARK-6469
>                 URL: https://issues.apache.org/jira/browse/SPARK-6469
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Christophe PRÉAUD
>            Priority: Minor
>         Attachments: TestYarnVars.scala
>
>
> According to the [Spark YARN doc page|http://spark.apache.org/docs/latest/running-on-yarn.html#important-notes], Spark executors will use the local directories configured for YARN, not {{spark.local.dir}}, which should be ignored.
> While this works correctly in yarn-cluster mode, I've found that it is not the case in yarn-client mode.
> The problem seems to originate in the method [isRunningInYarnContainer|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L686].
> Indeed, I've checked with a simple application that the {{CONTAINER_ID}} environment variable is correctly set in yarn-cluster mode (to something like {{container_1426666761810_0151_01_000001}}), but not in yarn-client mode.
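
For reference, here is a minimal sketch of what an application like the attached TestYarnVars.scala could look like (a hypothetical reconstruction, assuming it only needs to print the variable from the driver JVM; the actual attachment may differ):
{code}
import org.apache.spark.{SparkConf, SparkContext}

object TestYarnVars {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("TestYarnVars"))
    // In yarn-cluster mode the driver runs inside a YARN container, so YARN
    // sets CONTAINER_ID in its environment; in yarn-client mode the driver
    // runs on the submitting machine and the variable is null.
    println("CONTAINER_ID: " + System.getenv("CONTAINER_ID"))
    sc.stop()
  }
}
{code}
This matters because {{Utils.isRunningInYarnContainer}} decides whether to use the YARN-provided local directories by probing exactly these YARN-set environment variables; at the time of this report the check is roughly equivalent to the following (paraphrased from the linked Utils.scala, details may differ by version):
{code}
def isRunningInYarnContainer(conf: SparkConf): Boolean = {
  // Both variables are set by YARN inside a container:
  // CONTAINER_ID on Hadoop 2.x, YARN_LOCAL_DIRS on Hadoop 0.23.x.
  conf.getenv("CONTAINER_ID") != null || conf.getenv("YARN_LOCAL_DIRS") != null
}
{code}
Since neither variable is set in the driver's environment in yarn-client mode, the check returns false on the driver side there, and {{spark.local.dir}} is used instead of the YARN local directories.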