[ https://issues.apache.org/jira/browse/SPARK-7776?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tathagata Das updated SPARK-7776:
---------------------------------
    Description: 
A shutdown hook to stop the SparkContext was added recently. This results in ugly 
errors when a streaming application is terminated by Ctrl-C.

{code}
Exception in thread "Thread-27" org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:736)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:735)
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
        at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:735)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1468)
        at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
        at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1403)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1642)
        at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:559)
        at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2266)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(Utils.scala:2236)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2236)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2236)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1764)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2236)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2236)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2236)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2236)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2218)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
{code}

This happens because Spark's shutdown hook stops the SparkContext while streaming 
jobs are still running, so those jobs fail midway. The correct solution is to stop 
the StreamingContext before the SparkContext. 
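
Below is a minimal application-level sketch of the ordering this implies. It is illustrative only, not the actual Spark patch: it registers a plain JVM shutdown hook via Scala's sys.addShutdownHook, whereas the real fix would live inside StreamingContext itself (e.g. via Spark's internal SparkShutdownHookManager, with a priority that runs before the SparkContext hook). The StreamingContext.stop(stopSparkContext, stopGracefully) overload is the existing API; the app name, host, and port are placeholders.

{code}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch only: a user-level approximation of the proposed fix.
object GracefulShutdownSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("graceful-shutdown-sketch") // placeholder app name
    val ssc = new StreamingContext(conf, Seconds(1))

    // Placeholder pipeline so the context has an output operation to run.
    ssc.socketTextStream("localhost", 9999).print()

    // Stop the StreamingContext before the SparkContext on JVM shutdown, so
    // in-flight batches can complete instead of dying with
    // "Job cancelled because SparkContext was shut down".
    sys.addShutdownHook {
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}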



> Add shutdown hook to stop StreamingContext
> ------------------------------------------
>
>                 Key: SPARK-7776
>                 URL: https://issues.apache.org/jira/browse/SPARK-7776
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>            Reporter: Tathagata Das
>            Assignee: Tathagata Das
>            Priority: Blocker
>


