[ 
https://issues.apache.org/jira/browse/SPARK-37028?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

weixiuli updated SPARK-37028:
-----------------------------
    Description: 
An executor running on a bad node (e.g. the system is overloaded or its disks 
are busy) or suffering heavy GC overhead can hurt the efficiency of job 
execution. Speculative execution mitigates this, but the speculated task may 
itself land on a bad executor.
We should have a "kill" link for each executor, similar to the one we have for 
each stage, so it's easier for users to kill executors in the UI.
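For context, the existing per-stage "kill" link in the Web UI is controlled by 
the `spark.ui.killEnabled` configuration, and speculative execution is tuned 
via the `spark.speculation*` settings; a per-executor kill link would 
presumably be gated by the same UI flag. A minimal `spark-defaults.conf` 
sketch (all property names below are standard Spark configs; the values are 
illustrative only):

```properties
# Enable the existing "kill" links for jobs/stages in the Web UI
spark.ui.killEnabled          true
# Re-launch slow-running tasks speculatively on other executors
spark.speculation             true
# A task is eligible for speculation once 75% of tasks in the stage finish
spark.speculation.quantile    0.75
# ...and it is running at least 1.5x slower than the median task
spark.speculation.multiplier  1.5
```

As the description notes, speculation alone does not help when the speculated 
copy also lands on a bad executor, which is what motivates a manual kill link.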

  was:
The executor which is running in a bad node(eg. The system is overloaded or 
disks are busy) or it has big GC overheads may affect the efficiency of job 
execution, although there are speculative mechanisms to resolve this 
problem,but sometimes the speculated task may also run in a bad executor.
We should have a "kill" link for each executor, similar to what we have for 
each stage, so it's easier for users to kill executors in the UI.


>  Add a 'kill' executor link in Web UI.
> --------------------------------------
>
>                 Key: SPARK-37028
>                 URL: https://issues.apache.org/jira/browse/SPARK-37028
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.3.0
>            Reporter: weixiuli
>            Priority: Major
>
> An executor running on a bad node (e.g. the system is overloaded or its disks 
> are busy) or suffering heavy GC overhead can hurt the efficiency of job 
> execution. Speculative execution mitigates this, but the speculated task may 
> itself land on a bad executor.
> We should have a "kill" link for each executor, similar to the one we have for 
> each stage, so it's easier for users to kill executors in the UI.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
