You may want to take a look at this PR:
https://github.com/apache/spark/pull/1558

Long story short: while showing running applications wouldn't be a
terrible idea, your particular case should be solved differently.
Currently, applications are responsible for calling
"SparkContext.stop()" at the end of their run, so you should make sure
your code does that even when something goes wrong.

If that is done, they'll show up in the History Server.
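
For illustration, here's a minimal sketch in Scala; the object name,
app name, and job body are placeholders, and it assumes event logging
is enabled (spark.eventLog.enabled=true):

    import org.apache.spark.{SparkConf, SparkContext}

    object MyJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("my-job"))
        try {
          // Placeholder job logic; a failure here still reaches the
          // finally block below.
          sc.parallelize(1 to 100).count()
        } finally {
          // Stop the context even when the job fails, so the event
          // logs are finalized and the History Server can list the app.
          sc.stop()
        }
      }
    }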


On Thu, Oct 2, 2014 at 11:31 AM, SK <skrishna...@gmail.com> wrote:
> Hi,
>
> Currently the history server provides application details only for
> successfully completed jobs (where the APPLICATION_COMPLETE file is
> generated). However, (long-running) jobs that we terminate manually, or
> failed jobs where APPLICATION_COMPLETE may not be generated, don't show
> up on the history server page. They do, however, show up on the 4040
> interface as long as they are running. Is it possible to save those logs
> and load them in the history server (even when APPLICATION_COMPLETE is
> not present)? This would allow us to troubleshoot the failed and
> terminated jobs without holding up the cluster.
>
> thanks
>



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
