[ 
https://issues.apache.org/jira/browse/SPARK-6690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yin Huai resolved SPARK-6690.
-----------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

> spark-sql script ends up throwing Exception when event logging is enabled.
> --------------------------------------------------------------------------
>
>                 Key: SPARK-6690
>                 URL: https://issues.apache.org/jira/browse/SPARK-6690
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Kousuke Saruta
>            Priority: Minor
>             Fix For: 1.4.0
>
>
> When event logging is enabled, the spark-sql script ends up throwing an 
> exception like the following.
> {code}
> 15/04/03 13:51:49 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
> 15/04/03 13:51:49 ERROR scheduler.LiveListenerBus: Listener EventLoggingListener threw an exception
> java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:144)
>       at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:144)
>       at scala.Option.foreach(Option.scala:236)
>       at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:144)
>       at org.apache.spark.scheduler.EventLoggingListener.onApplicationEnd(EventLoggingListener.scala:188)
>       at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:54)
>       at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
>       at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
>       at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:53)
>       at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
>       at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:79)
>       at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1171)
>       at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
> Caused by: java.io.IOException: Filesystem closed
>       at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:707)
>       at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:1843)
>       at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1804)
>       at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:127)
>       ... 17 more
> 15/04/03 13:51:49 INFO ui.SparkUI: Stopped Spark web UI at http://sarutak-devel:4040
> 15/04/03 13:51:49 INFO scheduler.DAGScheduler: Stopping DAGScheduler
> Exception in thread "Thread-6" java.io.IOException: Filesystem closed
>       at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:707)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1760)
>       at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
>       at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
>       at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>       at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
>       at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
>       at org.apache.spark.scheduler.EventLoggingListener.stop(EventLoggingListener.scala:209)
>       at org.apache.spark.SparkContext$$anonfun$stop$3.apply(SparkContext.scala:1408)
>       at org.apache.spark.SparkContext$$anonfun$stop$3.apply(SparkContext.scala:1408)
>       at scala.Option.foreach(Option.scala:236)
>       at org.apache.spark.SparkContext.stop(SparkContext.scala:1408)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:66)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anon$1.run(SparkSQLCLIDriver.scala:107)
> {code}
> This is because FileSystem#close is called by the shutdown hook registered in 
> SparkSQLCLIDriver.
> {code}
>     Runtime.getRuntime.addShutdownHook(
>       new Thread() {
>         override def run() {
>           SparkSQLEnv.stop()
>         }
>       }
>     )
> {code}
> This issue was resolved by SPARK-3062, but I think it was reintroduced by 
> SPARK-2261.
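
The underlying race: `Runtime.getRuntime.addShutdownHook` gives no ordering guarantee among JVM shutdown hooks, so Hadoop's own hook can close the cached `FileSystem` clients before the hook above has flushed the event log via `SparkSQLEnv.stop()`. One way to avoid this kind of race (a sketch of the general idea, not necessarily the exact fix that landed in 1.4.0) is a priority-ordered hook manager, as Hadoop's `org.apache.hadoop.util.ShutdownHookManager` provides. The toy model below uses hypothetical names and priorities to illustrate the ordering:

```scala
import scala.collection.mutable.ArrayBuffer

// Toy model of a priority-ordered shutdown-hook manager, similar in
// spirit to Hadoop's org.apache.hadoop.util.ShutdownHookManager.
// Names and priorities here are illustrative, not Spark's actual fix.
object PriorityHooks {
  private val hooks = ArrayBuffer.empty[(Int, String, () => Unit)]

  def add(priority: Int, name: String)(body: => Unit): Unit =
    hooks += ((priority, name, () => body))

  // Run hooks highest-priority first and return names in execution order.
  def runAll(): Seq[String] =
    hooks.sortBy(-_._1).map { case (_, name, f) => f(); name }.toSeq
}

// Hadoop's FileSystem registers its cache-closing hook at priority 10
// (FileSystem.SHUTDOWN_HOOK_PRIORITY in Hadoop 2.x).
PriorityHooks.add(10, "FileSystem.closeAll") { /* closes cached HDFS clients */ }
// Registering the CLI's stop hook at a higher priority guarantees the
// event log is flushed while HDFS is still open.
PriorityHooks.add(50, "SparkSQLEnv.stop") { /* flushes the event log */ }

println(PriorityHooks.runAll().mkString(" -> "))
// SparkSQLEnv.stop -> FileSystem.closeAll
```

In other words, registering the stop hook with a priority above Hadoop's `FileSystem.SHUTDOWN_HOOK_PRIORITY` (rather than via raw `Runtime.addShutdownHook`) makes the flush-then-close order deterministic.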



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
