Hi all,



When I exit the spark-sql console, the following exception is thrown:


My Spark version is 1.3.0 and my Hadoop version is 2.2.0.


Exception in thread "Thread-3" java.io.IOException: Filesystem closed
        at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:629)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1677)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
        at org.apache.spark.scheduler.EventLoggingListener.stop(EventLoggingListener.scala:196)
        at org.apache.spark.SparkContext$$anonfun$stop$4.apply(SparkContext.scala:1388)
        at org.apache.spark.SparkContext$$anonfun$stop$4.apply(SparkContext.scala:1388)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1388)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:66)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anon$1.run(SparkSQLCLIDriver.scala:107)
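From the trace, the event-log cleanup in SparkContext.stop seems to call FileSystem.exists on an HDFS handle that was already closed by something else during shutdown. I'm not sure what closes it in my setup, but the general pattern is easy to show with Hadoop's cached FileSystem alone. This is just a minimal sketch, assuming fs.defaultFS points at an HDFS cluster (the object name is made up):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object FsClosedSketch {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()

    // FileSystem.get caches instances per (scheme, authority, user),
    // so both calls below return the SAME shared object.
    val fsA = FileSystem.get(conf)
    val fsB = FileSystem.get(conf)

    // One component closing the shared instance on shutdown...
    fsA.close()

    // ...makes any later use of it (against HDFS) fail with
    // java.io.IOException: Filesystem closed, as in the trace above.
    fsB.exists(new Path("/tmp"))
  }
}

Is this a known issue in 1.3.0, or is there something I should change in my configuration to avoid it?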
