spark git commit: [SPARK-8405] [DOC] Add how to view logs on Web UI when yarn log aggregation is enabled

2015-07-27 Thread tgraves
Repository: spark
Updated Branches:
  refs/heads/branch-1.4 2b1973dd2 -> a671dad62


[SPARK-8405] [DOC] Add how to view logs on Web UI when yarn log aggregation is 
enabled

Some users may not be aware that the logs are available on the Web UI even if 
YARN log aggregation is enabled. Update the doc to make this clear and explain 
what needs to be configured.

Author: Carson Wang carson.w...@intel.com

Closes #7463 from carsonwang/YarnLogDoc and squashes the following commits:

274c054 [Carson Wang] Minor text fix
74df3a1 [Carson Wang] address comments
5a95046 [Carson Wang] Update the text in the doc
e5775c1 [Carson Wang] Update doc about how to view the logs on Web UI when yarn 
log aggregation is enabled

(cherry picked from commit 622838165756e9669cbf7af13eccbc719638f40b)
Signed-off-by: Tom Graves tgra...@yahoo-inc.com


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/a671dad6
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/a671dad6
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/a671dad6

Branch: refs/heads/branch-1.4
Commit: a671dad62362b129ae23c4c8947eaa6efa134e9f
Parents: 2b1973d
Author: Carson Wang carson.w...@intel.com
Authored: Mon Jul 27 08:02:40 2015 -0500
Committer: Tom Graves tgra...@yahoo-inc.com
Committed: Mon Jul 27 08:03:15 2015 -0500

--
 docs/running-on-yarn.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/a671dad6/docs/running-on-yarn.md
--
diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index 07b30bf..5290b21 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -68,9 +68,9 @@ In YARN terminology, executors and application masters run 
inside containers.
 
 yarn logs -applicationId <app ID>
 
-will print out the contents of all log files from all containers from the 
given application. You can also view the container log files directly in HDFS 
using the HDFS shell or API. The directory where they are located can be found 
by looking at your YARN configs (`yarn.nodemanager.remote-app-log-dir` and 
`yarn.nodemanager.remote-app-log-dir-suffix`).
+will print out the contents of all log files from all containers from the 
given application. You can also view the container log files directly in HDFS 
using the HDFS shell or API. The directory where they are located can be found 
by looking at your YARN configs (`yarn.nodemanager.remote-app-log-dir` and 
`yarn.nodemanager.remote-app-log-dir-suffix`). The logs are also available on 
the Spark Web UI under the Executors Tab. You need to have both the Spark 
history server and the MapReduce history server running and configure 
`yarn.log.server.url` in `yarn-site.xml` properly. The log URL on the Spark 
history server UI will redirect you to the MapReduce history server to show the 
aggregated logs.
 
-When log aggregation isn't turned on, logs are retained locally on each 
machine under `YARN_APP_LOGS_DIR`, which is usually configured to `/tmp/logs` 
or `$HADOOP_HOME/logs/userlogs` depending on the Hadoop version and 
installation. Viewing logs for a container requires going to the host that 
contains them and looking in this directory.  Subdirectories organize log files 
by application ID and container ID.
+When log aggregation isn't turned on, logs are retained locally on each 
machine under `YARN_APP_LOGS_DIR`, which is usually configured to `/tmp/logs` 
or `$HADOOP_HOME/logs/userlogs` depending on the Hadoop version and 
installation. Viewing logs for a container requires going to the host that 
contains them and looking in this directory.  Subdirectories organize log files 
by application ID and container ID. The logs are also available on the Spark 
Web UI under the Executors Tab and doesn't require running the MapReduce 
history server.
 
 To review per-container launch environment, increase 
`yarn.nodemanager.delete.debug-delay-sec` to a
 large value (e.g. 36000), and then access the application cache through 
`yarn.nodemanager.local-dirs`
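
For readers setting this up, a minimal yarn-site.xml sketch of the 
`yarn.log.server.url` property mentioned in the patch above; the host name is a 
placeholder, and 19888 is the MapReduce JobHistory server's default web port:

    <!-- Sketch only: jhs.example.com is a placeholder for your MapReduce
         JobHistory server host; adjust host and port for your cluster. -->
    <property>
      <name>yarn.log.server.url</name>
      <value>http://jhs.example.com:19888/jobhistory/logs</value>
    </property>

With this set, the log links on the Spark history server UI redirect to the 
MapReduce history server, which serves the aggregated container logs.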





spark git commit: [SPARK-8405] [DOC] Add how to view logs on Web UI when yarn log aggregation is enabled

2015-07-27 Thread tgraves
Repository: spark
Updated Branches:
  refs/heads/master 72981bc8f -> 622838165


[SPARK-8405] [DOC] Add how to view logs on Web UI when yarn log aggregation is 
enabled

Some users may not be aware that the logs are available on the Web UI even if 
YARN log aggregation is enabled. Update the doc to make this clear and explain 
what needs to be configured.

Author: Carson Wang carson.w...@intel.com

Closes #7463 from carsonwang/YarnLogDoc and squashes the following commits:

274c054 [Carson Wang] Minor text fix
74df3a1 [Carson Wang] address comments
5a95046 [Carson Wang] Update the text in the doc
e5775c1 [Carson Wang] Update doc about how to view the logs on Web UI when yarn 
log aggregation is enabled


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/62283816
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/62283816
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/62283816

Branch: refs/heads/master
Commit: 622838165756e9669cbf7af13eccbc719638f40b
Parents: 72981bc
Author: Carson Wang carson.w...@intel.com
Authored: Mon Jul 27 08:02:40 2015 -0500
Committer: Tom Graves tgra...@yahoo-inc.com
Committed: Mon Jul 27 08:02:40 2015 -0500

--
 docs/running-on-yarn.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/62283816/docs/running-on-yarn.md
--
diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index de22ab5..cac08a9 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -68,9 +68,9 @@ In YARN terminology, executors and application masters run 
inside containers.
 
 yarn logs -applicationId <app ID>
 
-will print out the contents of all log files from all containers from the 
given application. You can also view the container log files directly in HDFS 
using the HDFS shell or API. The directory where they are located can be found 
by looking at your YARN configs (`yarn.nodemanager.remote-app-log-dir` and 
`yarn.nodemanager.remote-app-log-dir-suffix`).
+will print out the contents of all log files from all containers from the 
given application. You can also view the container log files directly in HDFS 
using the HDFS shell or API. The directory where they are located can be found 
by looking at your YARN configs (`yarn.nodemanager.remote-app-log-dir` and 
`yarn.nodemanager.remote-app-log-dir-suffix`). The logs are also available on 
the Spark Web UI under the Executors Tab. You need to have both the Spark 
history server and the MapReduce history server running and configure 
`yarn.log.server.url` in `yarn-site.xml` properly. The log URL on the Spark 
history server UI will redirect you to the MapReduce history server to show the 
aggregated logs.
 
-When log aggregation isn't turned on, logs are retained locally on each 
machine under `YARN_APP_LOGS_DIR`, which is usually configured to `/tmp/logs` 
or `$HADOOP_HOME/logs/userlogs` depending on the Hadoop version and 
installation. Viewing logs for a container requires going to the host that 
contains them and looking in this directory.  Subdirectories organize log files 
by application ID and container ID.
+When log aggregation isn't turned on, logs are retained locally on each 
machine under `YARN_APP_LOGS_DIR`, which is usually configured to `/tmp/logs` 
or `$HADOOP_HOME/logs/userlogs` depending on the Hadoop version and 
installation. Viewing logs for a container requires going to the host that 
contains them and looking in this directory.  Subdirectories organize log files 
by application ID and container ID. The logs are also available on the Spark 
Web UI under the Executors Tab and doesn't require running the MapReduce 
history server.
 
 To review per-container launch environment, increase 
`yarn.nodemanager.delete.debug-delay-sec` to a
 large value (e.g. 36000), and then access the application cache through 
`yarn.nodemanager.local-dirs`
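
For context, a yarn-site.xml sketch of the log-aggregation settings related to 
the patch: the two `remote-app-log-dir` properties are named in the doc text, 
while `yarn.log-aggregation-enable` is the switch that turns aggregation on and 
is shown here for completeness. The values are the usual YARN defaults and may 
differ on your cluster:

    <!-- Sketch only: values shown are common YARN defaults, not Spark requirements. -->
    <property>
      <name>yarn.log-aggregation-enable</name>
      <value>true</value>
    </property>
    <property>
      <name>yarn.nodemanager.remote-app-log-dir</name>
      <value>/tmp/logs</value>
    </property>
    <property>
      <name>yarn.nodemanager.remote-app-log-dir-suffix</name>
      <value>logs</value>
    </property>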

