[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-07-28 Thread Andrew Or (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14644773#comment-14644773
 ] 

Andrew Or commented on SPARK-7268:
--

I just re-closed this as "Cannot reproduce" since that's the actual resolution.

> [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN
> -
>
> Key: SPARK-7268
> URL: https://issues.apache.org/jira/browse/SPARK-7268
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.4.0
>Reporter: Yi Zhou
>
> {noformat}
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Stopped
> 15/04/30 08:26:32 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 15/04/30 08:26:32 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 15/04/30 08:26:32 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 15/04/30 08:26:32 ERROR util.Utils: Uncaught exception in thread Thread-0
> java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> 	at org.apache.spark.util.SparkShutdownHookManager.checkState(Utils.scala:2191)
> 	at org.apache.spark.util.SparkShutdownHookManager.remove(Utils.scala:2185)
> 	at org.apache.spark.util.Utils$.removeShutdownHook(Utils.scala:2138)
> 	at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
> 	at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
> 	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:94)
> 	at org.apache.spark.SparkContext.stop(SparkContext.scala:1511)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:67)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anonfun$main$1.apply$mcV$sp(SparkSQLCLIDriver.scala:105)
> 	at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2204)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1724)
> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2155)
> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> 15/04/30 08:26:32 WARN util.ShutdownHookManager: ShutdownHook '$anon$6' failed, java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> 	at org.apache.spark.util.SparkShutdownHookManager.checkState(Utils.scala:2191)
> 	at org.apache.spark.util.SparkShutdownHookManager.remove(Utils.scala:2185)
> 	at org.apache.spark.util.Utils$.removeShutdownHook(Utils.scala:2138)
> 	at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
> 	at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
> 	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:94)
> 	at org.apache.spark.SparkContext.stop(SparkContext.scala:1511)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:67)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anonfun$main$1.apply$mcV$sp(SparkSQLCLIDriver.scala:105)
> 	at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2204)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1724)
> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2155)
> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-07-28 Thread Andrew Or (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14644774#comment-14644774
 ] 

Andrew Or commented on SPARK-7268:
--

By the way, [~yizhou], is this still an issue with the latest 1.4 branch?



[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-07-28 Thread Yi Zhou (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14644462#comment-14644462
 ] 

Yi Zhou commented on SPARK-7268:


The issue no longer exists in the latest Spark 1.5 master code, so I closed it.



[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-05-27 Thread Yin Huai (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561797#comment-14561797
 ] 

Yin Huai commented on SPARK-7268:
-

cc [~lian cheng]


[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-05-27 Thread Marcelo Vanzin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561791#comment-14561791
 ] 

Marcelo Vanzin commented on SPARK-7268:
---

The log comes from https://github.com/apache/spark/pull/5672; it isn't actually throwing an exception, just logging it. Perhaps that's a little too scary and the log should be removed, if that situation is expected?
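[Editorial note] The stack trace shows the failure pattern: DiskBlockManager.stop() calls removeShutdownHook() while it is itself running inside a shutdown hook, and SparkShutdownHookManager.checkState() rejects modifications once shutdown has begun; the exception is then caught and logged by logUncaughtExceptions() rather than propagated. A minimal sketch of that pattern (a simplified stand-in, not Spark's actual code; the class name `HookManagerSketch` is invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// Simplified sketch of the checkState pattern seen in the trace above:
// once runAll() starts, add()/remove() throw the same IllegalStateException.
public class HookManagerSketch {
    private final List<Runnable> hooks = new ArrayList<>();
    private boolean shuttingDown = false;

    private synchronized void checkState() {
        if (shuttingDown) {
            throw new IllegalStateException(
                "Shutdown hooks cannot be modified during shutdown.");
        }
    }

    public synchronized void add(Runnable hook) {
        checkState();
        hooks.add(hook);
    }

    public synchronized void remove(Runnable hook) {
        checkState(); // throws when called from inside a running hook
        hooks.remove(hook);
    }

    public void runAll() {
        List<Runnable> snapshot;
        synchronized (this) {
            shuttingDown = true; // from here on, modification is illegal
            snapshot = new ArrayList<>(hooks);
        }
        for (Runnable hook : snapshot) {
            try {
                hook.run();
            } catch (RuntimeException e) {
                // Like Utils.logUncaughtExceptions: log and keep going,
                // so shutdown still completes despite the exception.
                System.err.println("Uncaught exception in hook: " + e.getMessage());
            }
        }
    }

    public static void main(String[] args) {
        HookManagerSketch mgr = new HookManagerSketch();
        Runnable[] hook = new Runnable[1];
        // Like DiskBlockManager.stop(), the hook tries to deregister itself.
        hook[0] = () -> mgr.remove(hook[0]);
        mgr.add(hook[0]);
        mgr.runAll(); // logs the exception to stderr instead of crashing
    }
}
```

This is consistent with the observation above: the ERROR/WARN lines are logging, not a fatal failure, since the catch in the run-all loop swallows the exception.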


[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-05-27 Thread Marcelo Vanzin (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561789#comment-14561789
 ] 

Marcelo Vanzin commented on SPARK-7268:
---

The log comes from https://github.com/apache/spark/pull/5672; it isn't actually throwing an exception, just logging it. Perhaps that's a little too scary and the log should be removed, if that situation is expected?


[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-05-27 Thread Andrew Or (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561784#comment-14561784
 ] 

Andrew Or commented on SPARK-7268:
--

[~vanzin] Is this related to the recent shutdown hook change?

> [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN
> -
>
> Key: SPARK-7268
> URL: https://issues.apache.org/jira/browse/SPARK-7268
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.4.0
>Reporter: Yi Zhou
>
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Interrupting 
> monitor thread
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Asking each 
> executor to shut down
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Stopped
> 15/04/30 08:26:32 INFO 
> scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
> OutputCommitCoordinator stopped!
> 15/04/30 08:26:32 INFO 
> scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
> OutputCommitCoordinator stopped!
> 15/04/30 08:26:32 INFO spark.MapOutputTrackerMasterEndpoint: 
> MapOutputTrackerMasterEndpoint stopped!
> 15/04/30 08:26:32 ERROR util.Utils: Uncaught exception in thread Thread-0
> java.lang.IllegalStateException: Shutdown hooks cannot be modified during 
> shutdown.
> at 
> org.apache.spark.util.SparkShutdownHookManager.checkState(Utils.scala:2191)
> at 
> org.apache.spark.util.SparkShutdownHookManager.remove(Utils.scala:2185)
> at org.apache.spark.util.Utils$.removeShutdownHook(Utils.scala:2138)
> at 
> org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
> at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
> at org.apache.spark.SparkEnv.stop(SparkEnv.scala:94)
> at org.apache.spark.SparkContext.stop(SparkContext.scala:1511)
> at 
> at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:67)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anonfun$main$1.apply$mcV$sp(SparkSQLCLIDriver.scala:105)
> at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2204)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2173)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1724)
> at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2173)
> at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2155)
> at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> 15/04/30 08:26:32 WARN util.ShutdownHookManager: ShutdownHook '$anon$6' failed, java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> at org.apache.spark.util.SparkShutdownHookManager.checkState(Utils.scala:2191)
> at org.apache.spark.util.SparkShutdownHookManager.remove(Utils.scala:2185)
> at org.apache.spark.util.Utils$.removeShutdownHook(Utils.scala:2138)
> at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
> at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
> at org.apache.spark.SparkEnv.stop(SparkEnv.scala:94)
> at org.apache.spark.SparkContext.stop(SparkContext.scala:1511)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:67)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anonfun$main$1.apply$mcV$sp(SparkSQLCLIDriver.scala:105)
> at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2204)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2173)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1724)
> at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2173)
> at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2155)
> at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
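The trace above shows the failure pattern: a registered shutdown hook (the SparkSQLCLIDriver one) calls SparkContext.stop(), which reaches DiskBlockManager.stop(), which in turn tries to remove its own shutdown hook while the manager is already iterating its hooks, so checkState() throws. The following is a minimal, hypothetical model of that race, not Spark's actual SparkShutdownHookManager (HookManager and its method names are invented for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for a shutdown hook manager that forbids
// modification once shutdown has started.
class HookManager {
    private final List<Runnable> hooks = new ArrayList<>();
    private boolean shuttingDown = false;

    synchronized void add(Runnable hook) {
        checkState();
        hooks.add(hook);
    }

    synchronized void remove(Runnable hook) {
        checkState();  // throws once runAll() has begun
        hooks.remove(hook);
    }

    private void checkState() {
        if (shuttingDown) {
            throw new IllegalStateException(
                "Shutdown hooks cannot be modified during shutdown.");
        }
    }

    void runAll() {
        synchronized (this) {
            shuttingDown = true;
        }
        for (Runnable hook : hooks) {
            hook.run();
        }
    }
}

public class ShutdownRaceDemo {
    public static void main(String[] args) {
        HookManager manager = new HookManager();
        Runnable diskCleanup = () -> { /* e.g. delete temp block files */ };
        manager.add(diskCleanup);
        // Analogue of the stop() hook in the trace: it also tries to
        // unregister diskCleanup, which is illegal mid-shutdown.
        manager.add(() -> {
            try {
                manager.remove(diskCleanup);
            } catch (IllegalStateException e) {
                System.out.println(e.getMessage());
            }
        });
        manager.runAll();
        // prints "Shutdown hooks cannot be modified during shutdown."
    }
}
```

The key point is that the exception is raised by the manager's own state check, not by anything the removed hook does; any stop() path reachable from inside a hook will hit it.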

---

[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-05-27 Thread Andrew Or (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561774#comment-14561774 ]

Andrew Or commented on SPARK-7268:
--

[~vanzin] Is this related to the recent shutdown hook change?




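One common way to make a stop() path safe to invoke from inside a shutdown hook, relevant to the question of whether the CLI driver is being shut down the wrong way, is to turn hook removal into a no-op once shutdown has begun instead of throwing. This is a sketch of that general defensive pattern only; SafeHookManager and removeIfNotShuttingDown are invented names, it is not Spark's actual fix, and the issue itself was ultimately closed as "Cannot reproduce":

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical variant: removal during shutdown is tolerated (returns
// false) rather than rejected with IllegalStateException.
class SafeHookManager {
    private final List<Runnable> hooks = new ArrayList<>();
    private boolean shuttingDown = false;

    synchronized void add(Runnable hook) {
        hooks.add(hook);
    }

    // Safe to call from inside a running hook: once shutdown has started,
    // the removal is silently skipped.
    synchronized boolean removeIfNotShuttingDown(Runnable hook) {
        if (shuttingDown) {
            return false;
        }
        return hooks.remove(hook);
    }

    void runAll() {
        List<Runnable> snapshot;
        synchronized (this) {
            shuttingDown = true;
            snapshot = new ArrayList<>(hooks);  // iterate a copy
        }
        for (Runnable hook : snapshot) {
            hook.run();
        }
    }
}
```

Because runAll() iterates a snapshot taken under the lock and then runs hooks outside it, a hook that calls back into removeIfNotShuttingDown() neither deadlocks nor throws; the trade-off is that stop() must tolerate its cleanup hook still firing later.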

---

[jira] [Commented] (SPARK-7268) [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN

2015-05-27 Thread Yin Huai (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561601#comment-14561601 ]

Yin Huai commented on SPARK-7268:
-

[~andrewor14] Do you think that we are shutting down the CLI driver in the 
wrong way?



