[ https://issues.apache.org/jira/browse/HIVE-16422?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sahil Takiar updated HIVE-16422:
--------------------------------
    Issue Type: Sub-task  (was: Bug)
        Parent: HIVE-20271

> Should kill running Spark Jobs when a query is cancelled.
> ---------------------------------------------------------
>
>                 Key: HIVE-16422
>                 URL: https://issues.apache.org/jira/browse/HIVE-16422
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>    Affects Versions: 2.1.0
>            Reporter: zhihai xu
>            Assignee: zhihai xu
>            Priority: Major
>             Fix For: 3.0.0
>
>         Attachments: HIVE-16422.000.txt
>
>
> Running Spark jobs should be killed when a query is cancelled. When a query 
> is cancelled, Driver.close calls Driver.releaseDriverContext, which calls 
> DriverContext.shutdown, which in turn calls shutdown on every running task:
> {code}
>   public synchronized void shutdown() {
>     LOG.debug("Shutting down query " + ctx.getCmd());
>     shutdown = true;
>     for (TaskRunner runner : running) {
>       if (runner.isRunning()) {
>         Task<?> task = runner.getTask();
>         LOG.warn("Shutting down task : " + task);
>         try {
>           task.shutdown();
>         } catch (Exception e) {
>           console.printError("Exception on shutting down task " + 
> task.getId() + ": " + e);
>         }
>         Thread thread = runner.getRunner();
>         if (thread != null) {
>           thread.interrupt();
>         }
>       }
>     }
>     running.clear();
>   }
> {code}
> Since SparkTask does not implement the shutdown method to kill the running 
> Spark job, the Spark job may still be running after the query is cancelled. 
> It would therefore be good to kill the Spark job in SparkTask.shutdown to 
> save cluster resources.
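
The proposed fix could be sketched as a shutdown override on SparkTask that cancels the submitted job. The classes below (SparkJobRef, its cancel method, the simplified Task base) are illustrative stand-ins, not the actual Hive or Spark APIs:

```java
// Minimal sketch of killing a running Spark job from Task.shutdown.
// SparkJobRef and its methods are hypothetical placeholders for whatever
// handle the task keeps on the submitted job.
abstract class Task {
    protected volatile boolean isShutdown = false;

    // Called by DriverContext.shutdown when the query is cancelled.
    public void shutdown() {
        isShutdown = true;
    }
}

class SparkJobRef {
    private volatile boolean running = true;

    public boolean isRunning() { return running; }

    // Stands in for asking the cluster to kill the remote job.
    public void cancel() { running = false; }
}

class SparkTask extends Task {
    private SparkJobRef jobRef; // set when the job is submitted

    public void setJobRef(SparkJobRef ref) { this.jobRef = ref; }

    @Override
    public void shutdown() {
        super.shutdown(); // mark the task as shut down first
        SparkJobRef ref = jobRef;
        if (ref != null && ref.isRunning()) {
            ref.cancel(); // kill the Spark job to free cluster resources
        }
    }
}

public class Main {
    public static void main(String[] args) {
        SparkTask task = new SparkTask();
        SparkJobRef ref = new SparkJobRef();
        task.setJobRef(ref);
        task.shutdown(); // simulates DriverContext.shutdown reaching this task
        System.out.println(ref.isRunning() ? "running" : "killed");
    }
}
```

With this override in place, the existing loop in DriverContext.shutdown (quoted above) would reach the Spark job through task.shutdown() instead of leaving it running on the cluster.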



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
