Repository: spark
Updated Branches:
  refs/heads/branch-2.0 074989af9 -> 17f43cc87


[YARN][DOC][MINOR] Remove several obsolete env variables and update the doc

## What changes were proposed in this pull request?

Remove several obsolete env variables that are no longer supported by Spark on YARN, and update the docs to reflect several changes in 2.0.
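
For anyone migrating existing setups, the removed variables all map to standard `spark-submit` options / Spark properties. A minimal illustrative sketch (not part of this patch; the application name, paths, and jar below are placeholders):

    # Hypothetical example; adjust names and paths for your job.
    #   SPARK_YARN_APP_NAME      -> --name      (spark.app.name)
    #   SPARK_YARN_QUEUE         -> --queue     (spark.yarn.queue)
    #   SPARK_YARN_DIST_FILES    -> --files     (spark.yarn.dist.files)
    #   SPARK_YARN_DIST_ARCHIVES -> --archives  (spark.yarn.dist.archives)
    ./bin/spark-submit \
      --master yarn \
      --name MyApp \
      --queue default \
      --files /path/to/extra.conf \
      --archives /path/to/deps.zip \
      --class com.example.MyApp my-app.jar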

## How was this patch tested?

N/A

CC vanzin tgravescs

Author: jerryshao <ss...@hortonworks.com>

Closes #13296 from jerryshao/yarn-doc.

(cherry picked from commit 1b98fa2e4382d3d8385cf1ac25d7fd3ae5650475)
Signed-off-by: Marcelo Vanzin <van...@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/17f43cc8
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/17f43cc8
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/17f43cc8

Branch: refs/heads/branch-2.0
Commit: 17f43cc87ed8b3c77b7c34163340da8e2da48eb1
Parents: 074989a
Author: jerryshao <ss...@hortonworks.com>
Authored: Fri May 27 11:31:25 2016 -0700
Committer: Marcelo Vanzin <van...@cloudera.com>
Committed: Fri May 27 11:31:37 2016 -0700

----------------------------------------------------------------------
 conf/spark-env.sh.template | 4 ----
 docs/running-on-yarn.md    | 4 ++++
 2 files changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/17f43cc8/conf/spark-env.sh.template
----------------------------------------------------------------------
diff --git a/conf/spark-env.sh.template b/conf/spark-env.sh.template
index a031cd6..9cffdc3 100755
--- a/conf/spark-env.sh.template
+++ b/conf/spark-env.sh.template
@@ -40,10 +40,6 @@
 # - SPARK_EXECUTOR_CORES, Number of cores for the executors (Default: 1).
 # - SPARK_EXECUTOR_MEMORY, Memory per Executor (e.g. 1000M, 2G) (Default: 1G)
 # - SPARK_DRIVER_MEMORY, Memory for Driver (e.g. 1000M, 2G) (Default: 1G)
-# - SPARK_YARN_APP_NAME, The name of your application (Default: Spark)
-# - SPARK_YARN_QUEUE, The hadoop queue to use for allocation requests (Default: 'default')
-# - SPARK_YARN_DIST_FILES, Comma separated list of files to be distributed with the job.
-# - SPARK_YARN_DIST_ARCHIVES, Comma separated list of archives to be distributed with the job.
 
 # Options for the daemons used in the standalone deploy mode
 # - SPARK_MASTER_IP, to bind the master to a different IP address or hostname

http://git-wip-us.apache.org/repos/asf/spark/blob/17f43cc8/docs/running-on-yarn.md
----------------------------------------------------------------------
diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index f2fbe3c..9833806 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -60,6 +60,8 @@ Running Spark on YARN requires a binary distribution of Spark which is built wit
 Binary distributions can be downloaded from the [downloads page](http://spark.apache.org/downloads.html) of the project website.
 To build Spark yourself, refer to [Building Spark](building-spark.html).
 
+To make Spark runtime jars accessible from YARN side, you can specify `spark.yarn.archive` or `spark.yarn.jars`. For details please refer to [Spark Properties](running-on-yarn.html#spark-properties). If neither `spark.yarn.archive` nor `spark.yarn.jars` is specified, Spark will create a zip file with all jars under `$SPARK_HOME/jars` and upload it to the distributed cache.
+
 # Configuration
 
 Most of the configs are the same for Spark on YARN as for other deployment modes. See the [configuration page](configuration.html) for more information on those.  These are configs that are specific to Spark on YARN.
@@ -99,6 +101,8 @@ to the same log file).
 
 If you need a reference to the proper location to put log files in the YARN so that YARN can properly display and aggregate them, use `spark.yarn.app.container.log.dir` in your `log4j.properties`. For example, `log4j.appender.file_appender.File=${spark.yarn.app.container.log.dir}/spark.log`.
 For streaming applications, configuring `RollingFileAppender` and setting file location to YARN's log directory will avoid disk overflow caused by large log files, and logs can be accessed using YARN's log utility.
 
+To use a custom metrics.properties for the application master and executors, update the `$SPARK_CONF_DIR/metrics.properties` file. It will automatically be uploaded with other configurations, so you don't need to specify it manually with `--files`.
+
 #### Spark Properties
 
 <table class="table">
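
As a usage note for the `spark.yarn.archive` / `spark.yarn.jars` paragraph added above: one common way to avoid re-uploading the runtime jars for every application is to stage them on HDFS once and point Spark at them. A rough sketch, assuming an HDFS location of your choosing (all paths and archive names below are placeholders):

    # Package the local Spark jars without compression and upload them to HDFS.
    jar cv0f spark-libs.jar -C "$SPARK_HOME/jars/" .
    hdfs dfs -mkdir -p /user/spark/jars
    hdfs dfs -put spark-libs.jar /user/spark/jars/
    # Then reference the archive, e.g. in conf/spark-defaults.conf:
    #   spark.yarn.archive  hdfs:///user/spark/jars/spark-libs.jar
    # or list individual jars instead (globs are allowed):
    #   spark.yarn.jars     hdfs:///user/spark/jars/*.jar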

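Relatedly, the logging guidance quoted in the second hunk above can be made concrete with `log4j.properties` entries along these lines (a sketch only; the appender name, file size, and backup count are arbitrary examples, assuming the log4j 1.x API shipped with Spark 2.0):

    # Roll logs inside the YARN container log directory instead of growing one file.
    log4j.rootLogger=INFO, rolling
    log4j.appender.rolling=org.apache.log4j.RollingFileAppender
    log4j.appender.rolling.File=${spark.yarn.app.container.log.dir}/spark.log
    log4j.appender.rolling.MaxFileSize=50MB
    log4j.appender.rolling.MaxBackupIndex=5
    log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
    log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n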

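Likewise, for the new metrics note: editing `$SPARK_CONF_DIR/metrics.properties` is enough, since that directory is uploaded to the application master and executors automatically. A minimal illustrative configuration enabling the built-in console sink (the sink choice and reporting period are just an example):

    # Report metrics from all instances to stdout every 10 seconds.
    *.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
    *.sink.console.period=10
    *.sink.console.unit=seconds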