Hi,

OK, I was specifying --master local. I changed that to --master
spark://<localhostname>:7077 and am now able to see the completed
applications. It provides summary stats about runtime and memory usage,
which is sufficient for me at this time.

However, it doesn't seem to archive the info in the "application detail
UI" that lists detailed stats about the completed stages of the
application, which would be useful for identifying bottleneck steps in a
large application. I guess we need to capture the "application detail UI"
screen before the app run completes, or find a way to extract this info by
parsing the JSON log file in /tmp/spark-events.
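For the parsing route, here's a rough sketch of what I had in mind. It assumes the event log is one JSON object per line, with completed stages recorded as "SparkListenerStageCompleted" events carrying "Submission Time" and "Completion Time" (in ms) under "Stage Info" -- field names taken from the logs on my machine, so double-check against your Spark version:

```python
import json

def stage_stats(lines):
    """Extract (stage name, duration in ms) for each completed stage
    from Spark event-log lines (one JSON object per line)."""
    stats = []
    for line in lines:
        event = json.loads(line)
        if event.get("Event") == "SparkListenerStageCompleted":
            info = event.get("Stage Info", {})
            start = info.get("Submission Time")
            end = info.get("Completion Time")
            if start is not None and end is not None:
                stats.append((info.get("Stage Name"), end - start))
    return stats

# Synthetic example line; real logs live under /tmp/spark-events/<app-id>.
sample = json.dumps({
    "Event": "SparkListenerStageCompleted",
    "Stage Info": {"Stage Name": "count at App.scala:10",
                   "Submission Time": 1000, "Completion Time": 4500},
})
print(stage_stats([sample]))  # → [('count at App.scala:10', 3500)]
```

Sorting the result by duration (descending) would surface the bottleneck stages directly.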

thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p12187.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

