Hi SK,

Not sure if I understand you correctly, but here is how the user normally
uses the event logging functionality:

After setting "spark.eventLog.enabled" and optionally "spark.eventLog.dir",
the user runs their Spark application and calls sc.stop() at the end of
it. They then go to the standalone Master UI (at http://<master-url>:8080
by default) and click on the application in the Completed Applications
table. This links to the Spark UI of the finished application in its
completed state, under a path that looks like
"http://<master-url>:8080/history/<app-Id>".
It won't be on "http://localhost:4040" anymore because the port is now
freed for new applications to bind their SparkUIs to. To access the file
that stores the raw statistics, go to the file specified in
"spark.eventLog.dir". This is by default "/tmp/spark-events", though in
Spark 1.0.1 it may be in HDFS under the same path.
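For reference, one way to set both properties is on the spark-submit
command line; the application jar name below is a placeholder, and
/tmp/spark-events is simply the default log directory:

```shell
# Illustrative spark-submit invocation; the jar name is a placeholder
# and /tmp/spark-events is just the default event log directory.
spark-submit \
  --master spark://<master-url>:7077 \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=/tmp/spark-events \
  my-app.jar
```

You can equivalently put the two spark.eventLog.* lines in
conf/spark-defaults.conf so every submission picks them up.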

I could be misunderstanding what you mean by the stats being buried in the
console output, because the events are not logged to the console but to a
file in "spark.eventLog.dir". For all of this to work, of course, you have
to run Spark in standalone mode (i.e., with the master set to
spark://<master-url>:7077). In other deploy modes, you will need to run
the history server instead.
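A sketch of the history-server route, assuming it reads the same event
log directory (in some 1.0.x releases the start script takes the log
directory as an argument; check the docs for your exact version):

```shell
# Start the history server (the sbin script ships with Spark).
# The log directory shown is illustrative -- it must match
# spark.eventLog.dir used by the applications.
./sbin/start-history-server.sh /tmp/spark-events
# The UI is then served at http://<server-host>:18080 by default.
```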

Does this make sense?
Andrew


2014-08-14 18:08 GMT-07:00 SK <skrishna...@gmail.com>:

> More specifically, as indicated by Patrick above, in 1.0+, apps will have
> persistent state so that the UI can be reloaded. Is there a way to enable
> this feature in 1.0.1?
>
> thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p12157.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>