Github user ericl commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17166#discussion_r106051697
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -2250,6 +2250,25 @@ class SparkContext(config: SparkConf) extends Logging {
       }
     
       /**
    +   * Kill a given task. It will be retried.
    +   *
    +   * @param taskId the task ID to kill
    +   */
    +  def killTask(taskId: Long): Unit = {
    +    killTask(taskId, "cancelled")
    +  }
    +
    +  /**
    +   * Kill a given task. It will be retried.
    +   *
    +   * @param taskId the task ID to kill
    +   * @param reason the reason for killing the task, which should be a short string
    +   */
    +  def killTask(taskId: Long, reason: String): Unit = {
    --- End diff ---
    
    > What is the expectation when a task is being killed.
    > Is it specifically for the task being referenced; or all attempts of the task?
    
    The current task attempt (which is uniquely identified by the task id). I updated the docs as suggested here.
    
    > "killAndRescheduleTask" implies it will be rescheduled - which might not 
occur in case this was a speculative task (or already completed) : would be 
good to clarify.
    
    Went with killTaskAttempt.
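    
    For concreteness, a minimal sketch of the renamed call, keeping the reason argument from the diff above (the task ID and reason string are made up for illustration):
    
    ```scala
    // Task IDs are per-attempt, so this kills only this one attempt; a
    // speculative copy or a later retry of the same task index is unaffected.
    // 1052L is a placeholder; in practice the ID comes from the UI or a listener.
    sc.killTaskAttempt(1052L, "cancelled by operator")
    ```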
    
    > Is this expected to be exposed via the UI ?
    > How is it to be leveraged (if not via UI) ?
    
    For now, you can look at the Spark UI, find the task ID, and call killTaskAttempt on it. It would be nice to have this as a button on the executor page in a follow-up. You can also have a listener that kills tasks as suggested.
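    
    As a rough illustration of the listener approach, here is a sketch that kills any attempt launched on a particular executor; the listener name and the executor-id filter are made up, and it assumes the two-argument killTaskAttempt(taskId, reason) form discussed above:
    
    ```scala
    import org.apache.spark.SparkContext
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskStart}
    
    // Illustrative only: kill every task attempt that starts on one executor.
    // Assumes the killTaskAttempt(taskId, reason) signature from this thread.
    class KillTasksOnExecutorListener(sc: SparkContext, targetExecutorId: String)
      extends SparkListener {
    
      override def onTaskStart(taskStart: SparkListenerTaskStart): Unit = {
        val info = taskStart.taskInfo
        if (info.executorId == targetExecutorId) {
          // info.taskId is the per-attempt ID shown in the UI, so only this
          // attempt is killed; the scheduler retries it if it is still needed.
          sc.killTaskAttempt(info.taskId, s"killed by listener on executor $targetExecutorId")
        }
      }
    }
    
    // Registration from the driver:
    // sc.addSparkListener(new KillTasksOnExecutorListener(sc, "3"))
    ```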

