@Brad

Your configuration looks alright to me. We parse both "file:/" and
"file:///" the same way so that shouldn't matter. I just tried this on the
latest master and verified that it works for me. Can you dig into the
directory "/tmp/spark-events/ml-pipeline-1408117588599" to make sure that
it's not empty? In particular, look for a file that looks like
"EVENT_LOG_0", then check the content of that file. The last event (on the
last line) of the file should be an "Application Complete" event. If this
is not true, it's likely that your application did not call "sc.stop()",
though the logs should still show up in spite of that. If all of that
fails, try logging it in a more accessible place through setting
"spark.eventLog.dir". Let me know if that helps.

@SK

You shouldn't need to capture the screen before it finishes; the whole
point of the event logging functionality is that the user doesn't have to
do that themselves. What happens if you click into the "application detail
UI"? In Spark 1.0.1, if it can't find the logs it may just refresh instead
of printing a more explicit message. However, from your configuration you
should be able to see the detailed stage information in the UI in addition
to just the summary statistics under "Completed Applications". I have
listed a few debugging steps in my reply to Brad above; they may apply to
your case as well.

Let me know if that works,
Andrew


2014-08-15 11:07 GMT-07:00 SK <skrishna...@gmail.com>:

> Hi,
>
> Ok, I was specifying --master local. I changed that to --master
> spark://<localhostname>:7077 and am now able to see the completed
> applications. It provides summary stats about runtime and memory usage,
> which is sufficient for me at this time.
>
> However, it doesn't seem to archive the info in the "application detail UI"
> that lists detailed stats about the completed stages of the application,
> which would be useful for identifying bottleneck steps in a large
> application. I guess we need to capture the "application detail UI" screen
> before the app run completes, or find a way to extract this info by parsing
> the JSON log file in /tmp/spark-events.
>
> thanks
>
>
>
