Yes, I’m running Hadoop’s Timeline Server, which does this for the YARN/Hadoop
logs (and works very nicely, btw). Are you saying I can do the same for the
Spark UI as well? Also, where do I set these Spark configurations, since this
will be executed inside a YARN container? On the “client”
I am working on a PR that allows one to send the same Spark listener event
messages back to the application in YARN cluster mode.
So far I have put this function in our application; our UI receives and
displays the same Spark job event messages, such as progress, job start, job
completed, etc.
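For anyone curious what "receiving the same Spark job event messages" looks like in code, a minimal sketch of a custom SparkListener is below. The `SparkListener` callbacks (`onJobStart`, `onStageCompleted`, `onJobEnd`) are Spark's real scheduler API; the `notifyUi` callback is a hypothetical stand-in for however your application forwards messages to its own UI.

```scala
import org.apache.spark.scheduler.{
  SparkListener,
  SparkListenerJobEnd,
  SparkListenerJobStart,
  SparkListenerStageCompleted
}

// Sketch only: forwards Spark job/stage events to an
// application-defined sink. `notifyUi` is hypothetical.
class UiForwardingListener(notifyUi: String => Unit) extends SparkListener {

  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    notifyUi(s"Job ${jobStart.jobId} started with ${jobStart.stageInfos.size} stages")

  override def onStageCompleted(stage: SparkListenerStageCompleted): Unit =
    notifyUi(s"Stage ${stage.stageInfo.stageId} completed")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    notifyUi(s"Job ${jobEnd.jobId} finished: ${jobEnd.jobResult}")
}

// Register on the driver, e.g.:
//   sc.addSparkListener(new UiForwardingListener(msg => println(msg)))
```

The same mechanism the Spark UI uses internally is what this taps into, which is why the events (progress, job start, completion) mirror what the UI shows.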
Hi,
The details laid out in the Spark UI for a job in progress are really
interesting and very useful.
But they vanish once the job is done.
Is there a way to get the job details after processing completes?
I'm looking for the Spark UI data, not the standard input/output/error info.
Thanks,
Harsha
Matt, you should be able to set an HDFS path so your logs get written to a
unified place instead of to local disk on a random box in the cluster.
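Concretely, this usually means enabling Spark's event logging with an HDFS directory and then pointing the Spark History Server at the same directory so finished jobs can still be browsed in a UI. A sketch (the HDFS path and application names are placeholders; the config keys themselves are Spark's):

```shell
# Write Spark event logs to a shared HDFS directory
# (hdfs://namenode:8020/spark-logs is a placeholder path).
spark-submit \
  --master yarn \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs://namenode:8020/spark-logs \
  --class com.example.MyApp myapp.jar

# Point the History Server at the same directory; completed
# jobs' UIs are then served (by default on port 18080).
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://namenode:8020/spark-logs"
./sbin/start-history-server.sh
```

Since the directory lives on HDFS rather than a local disk, it doesn't matter which node in the cluster the driver happened to run on.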
On Thu, Sep 25, 2014 at 1:38 PM, Matt Narrell matt.narr...@gmail.com
wrote:
How does this work with a cluster manager like YARN?
mn
On Sep 25,