[ https://issues.apache.org/jira/browse/SPARK-7268?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Cheng Hao updated SPARK-7268:
-----------------------------
    Target Version/s: 1.4.0

> [Spark SQL] Throw 'Shutdown hooks cannot be modified during shutdown' on YARN
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-7268
>                 URL: https://issues.apache.org/jira/browse/SPARK-7268
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Yi Zhou
>
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
> 15/04/30 08:26:32 INFO cluster.YarnClientSchedulerBackend: Stopped
> 15/04/30 08:26:32 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> 15/04/30 08:26:32 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
> 15/04/30 08:26:32 ERROR util.Utils: Uncaught exception in thread Thread-0
> java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> 	at org.apache.spark.util.SparkShutdownHookManager.checkState(Utils.scala:2191)
> 	at org.apache.spark.util.SparkShutdownHookManager.remove(Utils.scala:2185)
> 	at org.apache.spark.util.Utils$.removeShutdownHook(Utils.scala:2138)
> 	at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
> 	at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
> 	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:94)
> 	at org.apache.spark.SparkContext.stop(SparkContext.scala:1511)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:67)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anonfun$main$1.apply$mcV$sp(SparkSQLCLIDriver.scala:105)
> 	at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2204)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1724)
> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2155)
> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> 15/04/30 08:26:32 WARN util.ShutdownHookManager: ShutdownHook '$anon$6' failed, java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
> 	at org.apache.spark.util.SparkShutdownHookManager.checkState(Utils.scala:2191)
> 	at org.apache.spark.util.SparkShutdownHookManager.remove(Utils.scala:2185)
> 	at org.apache.spark.util.Utils$.removeShutdownHook(Utils.scala:2138)
> 	at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
> 	at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1214)
> 	at org.apache.spark.SparkEnv.stop(SparkEnv.scala:94)
> 	at org.apache.spark.SparkContext.stop(SparkContext.scala:1511)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:67)
> 	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anonfun$main$1.apply$mcV$sp(SparkSQLCLIDriver.scala:105)
> 	at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2204)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2173)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1724)
> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2173)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2155)
> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
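The stack trace shows the failure mode: `SparkSQLCLIDriver` registers a shutdown hook that calls `SparkContext.stop()`, and during that stop `DiskBlockManager.stop()` tries to deregister its own hook via `Utils.removeShutdownHook` while `SparkShutdownHookManager.runAll()` is already iterating the hooks, so `checkState` throws. A minimal, self-contained sketch of that state check (not Spark's actual code; the `HookManager` class and its method names here are illustrative):

```java
// Illustrative reproduction of the "modify hooks during shutdown" failure.
// HookManager mimics (in spirit) SparkShutdownHookManager's checkState guard.
import java.util.ArrayList;
import java.util.List;

class HookManager {
    private final List<Runnable> hooks = new ArrayList<>();
    private volatile boolean shuttingDown = false;

    private void checkState() {
        if (shuttingDown) {
            throw new IllegalStateException(
                "Shutdown hooks cannot be modified during shutdown.");
        }
    }

    synchronized void add(Runnable hook) {
        checkState();
        hooks.add(hook);
    }

    synchronized void remove(Runnable hook) {
        checkState();   // throws once runAll() has started
        hooks.remove(hook);
    }

    synchronized void runAll() {
        shuttingDown = true;   // from here on, add/remove are rejected
        for (Runnable h : hooks) {
            h.run();
        }
    }
}

public class Repro {
    public static void main(String[] args) {
        HookManager mgr = new HookManager();
        Runnable diskManagerHook = () -> { /* e.g. clean up local dirs */ };
        mgr.add(diskManagerHook);

        // A hook that, like SparkContext.stop() reaching
        // DiskBlockManager.stop(), tries to deregister another hook
        // while the manager is already running all hooks.
        mgr.add(() -> {
            try {
                mgr.remove(diskManagerHook);
            } catch (IllegalStateException e) {
                System.out.println(e.getMessage());
            }
        });

        mgr.runAll();
    }
}
```

Because `remove` throws before touching the list, the hook list is never mutated mid-iteration; the cost is the noisy `IllegalStateException` seen in the log whenever a component tries to deregister its hook from inside another shutdown hook.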