Repository: spark
Updated Branches:
  refs/heads/master e2b3d2367 -> 34767997e


Small rewording about history server use case

Hello
PR #10991 removed the built-in history view from Spark Standalone, so the
history server is no longer useful only to YARN or Mesos.

Author: Hervé <dud...@users.noreply.github.com>

Closes #17709 from dud225/patch-1.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/34767997
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/34767997
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/34767997

Branch: refs/heads/master
Commit: 34767997e0c6cb28e1fac8cb650fa3511f260ca5
Parents: e2b3d23
Author: Hervé <dud...@users.noreply.github.com>
Authored: Fri Apr 21 08:52:18 2017 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Fri Apr 21 08:52:18 2017 +0100

----------------------------------------------------------------------
 docs/monitoring.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/34767997/docs/monitoring.md
----------------------------------------------------------------------
diff --git a/docs/monitoring.md b/docs/monitoring.md
index da95438..3e577c5 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -27,8 +27,8 @@ in the UI to persisted storage.
 
 ## Viewing After the Fact
 
-If Spark is run on Mesos or YARN, it is still possible to construct the UI of an
-application through Spark's history server, provided that the application's event logs exist.
+It is still possible to construct the UI of an application through Spark's history server, 
+provided that the application's event logs exist.
 You can start the history server by executing:
 
     ./sbin/start-history-server.sh

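For context, the history server only has something to show if applications write event
logs. Below is a minimal sketch of the configuration this typically involves, using the
standard spark.eventLog.* and spark.history.fs.logDirectory properties; the directory
path and values are illustrative examples, not part of this commit.

    # conf/spark-defaults.conf (illustrative values, shared log directory assumed)
    spark.eventLog.enabled           true
    spark.eventLog.dir               hdfs:///spark-events
    spark.history.fs.logDirectory    hdfs:///spark-events

    # Then start the history server; its web UI listens on port 18080 by default
    ./sbin/start-history-server.sh
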
