Vinod KC created SPARK-6445:
-------------------------------

             Summary: IOException: Filesystem closed is thrown while exiting spark-sql console
                 Key: SPARK-6445
                 URL: https://issues.apache.org/jira/browse/SPARK-6445
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.0
         Environment: Hadoop version 2.2.0
            Reporter: Vinod KC
            Priority: Minor


When exiting the spark-sql console, the following exception is thrown.
 
Exception in thread "Thread-3" java.io.IOException: Filesystem closed
        at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:629)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1677)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
        at org.apache.spark.scheduler.EventLoggingListener.stop(EventLoggingListener.scala:196)
        at org.apache.spark.SparkContext$$anonfun$stop$4.apply(SparkContext.scala:1388)
        at org.apache.spark.SparkContext$$anonfun$stop$4.apply(SparkContext.scala:1388)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1388)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.stop(SparkSQLEnv.scala:66)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$$anon$1.run(SparkSQLCLIDriver.scala:107)
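The trace shows DFSClient.checkOpen throwing because EventLoggingListener.stop calls FileSystem.exists on a FileSystem instance that has already been closed, most likely by a competing shutdown hook. A minimal, hypothetical Java sketch of that ordering (FakeFileSystem is an illustration, not Hadoop's API):

```java
import java.io.IOException;

// Hypothetical stand-in for Hadoop's cached FileSystem: once close() has
// run, any further call fails with "Filesystem closed", mirroring
// DFSClient.checkOpen in the stack trace above.
class FakeFileSystem {
    private boolean open = true;

    void close() {
        open = false;   // e.g. Hadoop's own shutdown hook closing the cache
    }

    boolean exists(String path) throws IOException {
        if (!open) {
            throw new IOException("Filesystem closed");
        }
        return true;
    }
}

public class ShutdownRace {
    public static void main(String[] args) {
        FakeFileSystem fs = new FakeFileSystem();
        fs.close(); // the filesystem hook fires before Spark's cleanup
        try {
            // corresponds to EventLoggingListener.stop checking the event log
            fs.exists("/spark-events/app-1");
        } catch (IOException e) {
            System.out.println("java.io.IOException: " + e.getMessage());
        }
    }
}
```

If this is the cause, the fix would be to order Spark's stop before the filesystem's close (or tolerate an already-closed filesystem), rather than anything in the SQL layer itself.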
 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
