[ https://issues.apache.org/jira/browse/SPARK-3835?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14165722#comment-14165722 ]

Matt Cheah edited comment on SPARK-3835 at 10/9/14 8:48 PM:
------------------------------------------------------------

Any updates on this? I've tried tackling it myself, but I'm not sure how 
feasible it is: killing a JVM just causes a DisassociatedEvent to be fired, 
but a DisassociatedEvent is also fired if SparkContext.stop() is called, 
making it hard to tell whether a context was stopped gracefully or 
forcefully.
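
One way around the ambiguity might be for the driver to send an explicit 
message from SparkContext.stop() before its actor system shuts down, so 
that a DisassociatedEvent arriving with no prior message can be treated as 
a kill. A rough Scala/Akka sketch, not Spark's actual Master code (the 
GracefulShutdown message and the bookkeeping set are hypothetical):

    import akka.actor._
    import akka.remote.DisassociatedEvent

    import scala.collection.mutable

    // Hypothetical message a driver would send from SparkContext.stop()
    // just before shutting down its actor system.
    case class GracefulShutdown(driver: Address)

    class MasterSketch extends Actor {
      // Drivers that announced a clean stop before disassociating.
      private val stoppedCleanly = mutable.Set[Address]()

      override def preStart(): Unit =
        context.system.eventStream.subscribe(self, classOf[DisassociatedEvent])

      def receive = {
        case GracefulShutdown(driver) =>
          stoppedCleanly += driver

        case DisassociatedEvent(_, remote, _) =>
          // The event itself is identical in both cases; only the extra
          // message lets us tell a clean stop apart from a killed JVM.
          val state = if (stoppedCleanly.remove(remote)) "FINISHED" else "KILLED"
          println(s"Application at $remote marked $state")
      }
    }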


was (Author: mcheah):
Any updates on this?

> Spark applications that are killed should show up as "KILLED" or "CANCELLED" 
> in the Spark UI
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-3835
>                 URL: https://issues.apache.org/jira/browse/SPARK-3835
>             Project: Spark
>          Issue Type: Improvement
>          Components: Web UI
>    Affects Versions: 1.1.0
>            Reporter: Matt Cheah
>              Labels: UI
>
> Spark applications that crash or are killed are listed as FINISHED in the 
> Spark UI.
> It looks like the Master only passes back a list of "Running" applications 
> and a list of "Completed" applications. All of the applications under 
> "Completed" have status "FINISHED"; however, if they were killed manually 
> they should show "CANCELLED", and if they failed they should read "FAILED".
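
For reference, a sketch of the state split the description asks for, 
modeled on the Enumeration style used in Spark's deploy module (the KILLED 
value and the label helper below are hypothetical, not what shipped in 
1.1.0):

    // Sketch only: application end states the UI could distinguish.
    object ApplicationState extends Enumeration {
      type ApplicationState = Value
      // KILLED is the proposed addition; the others mirror existing states.
      val WAITING, RUNNING, FINISHED, FAILED, KILLED = Value
    }

    object UiLabels {
      // Render each completed application's actual end state instead of a
      // blanket FINISHED.
      def label(state: ApplicationState.Value): String = state match {
        case ApplicationState.KILLED => "CANCELLED"
        case ApplicationState.FAILED => "FAILED"
        case _                       => "FINISHED"
      }
    }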


