[
https://issues.apache.org/jira/browse/SPARK-1202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13954503#comment-13954503
]
ASF GitHub Bot commented on SPARK-1202:
---------------------------------------
Github user kayousterhout commented on a diff in the pull request:
https://github.com/apache/spark/pull/246#discussion_r11095648
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala ---
@@ -116,6 +118,16 @@ private[ui] class JobProgressListener(conf: SparkConf) extends SparkListener {
      val stages = poolToActiveStages.getOrElseUpdate(poolName, new HashMap[Int, StageInfo]())
      stages(stage.stageId) = stage
+
+ // Extract Job ID and double check if we have the details
+ val jobId = Option(stageSubmitted.properties).flatMap {
+ p => Option(p.getProperty("spark.job.id"))
+ }.getOrElse("-1").toInt
--- End diff ---
Ah cool -- I looked at the ordering of the JobStart and StageSubmitted
events more closely and I think you can safely remove this.
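For context on the pattern being discussed in the diff: it reads an optional property from a possibly-null `Properties` object, defaulting to `-1`. A minimal standalone sketch of that lookup (the helper name `extractJobId` and the sample values are illustrative, not from the patch):

```scala
import java.util.Properties

// Safely read "spark.job.id" from properties that may be null or
// may lack the key; fall back to -1, mirroring the diff above.
def extractJobId(props: Properties): Int =
  Option(props)
    .flatMap(p => Option(p.getProperty("spark.job.id")))
    .getOrElse("-1")
    .toInt

val withId = new Properties()
withId.setProperty("spark.job.id", "7")

println(extractJobId(withId)) // 7
println(extractJobId(null))   // -1 when no properties are attached
```

Wrapping both the `Properties` reference and the `getProperty` result in `Option` guards against a null at either level, which is why the original code used `flatMap` rather than a single `map`.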
> Add a "cancel" button in the UI for stages
> ------------------------------------------
>
> Key: SPARK-1202
> URL: https://issues.apache.org/jira/browse/SPARK-1202
> Project: Apache Spark
> Issue Type: New Feature
> Components: Web UI
> Reporter: Patrick Cogan
> Assignee: Sundeep Narravula
> Priority: Critical
> Fix For: 1.0.0
>
>
> Seems like this would be really useful for people. It's not that hard; we
> just need to look up the jobs associated with the stage and kill them. Might
> involve exposing some additional APIs in SparkContext.
--
This message was sent by Atlassian JIRA
(v6.2#6252)