[ https://issues.apache.org/jira/browse/HADOOP-2067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12535388 ]

Raghu Angadi commented on HADOOP-2067:
--------------------------------------

A workaround for user code affected by this is to wrap the stream in a 
{{BufferedInputStream(fsDataInputStream)}}.
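
The wrapping idiom can be sketched with plain java.io streams standing in for the DFS stream (a hypothetical {{FileInputStream}} replaces {{FSDataInputStream}} here so the example is self-contained; in real Hadoop code the raw stream would come from {{fileSystem.open(...)}}):

```java
import java.io.*;
import java.util.Properties;

public class BufferedCloseDemo {
    public static void main(String[] args) throws IOException {
        // Write a small XML properties file so the example is self-contained.
        File f = File.createTempFile("props", ".xml");
        f.deleteOnExit();
        Properties out = new Properties();
        out.setProperty("dfs.demo.key", "value1");
        try (FileOutputStream fos = new FileOutputStream(f)) {
            out.storeToXML(fos, null);
        }

        // Workaround: wrap the raw stream in a BufferedInputStream.
        // Properties.loadFromXML() closes the stream it is given, so the
        // explicit close() below is a second close. BufferedInputStream's
        // close() is a no-op once the stream is already closed, so the
        // underlying stream never sees the duplicate close.
        InputStream in = new BufferedInputStream(new FileInputStream(f));
        Properties p = new Properties();
        p.loadFromXML(in);
        in.close();  // second close; tolerated by BufferedInputStream

        System.out.println(p.getProperty("dfs.demo.key"));
    }
}
```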

> multiple close() failing in Hadoop 0.14
> ---------------------------------------
>
>                 Key: HADOOP-2067
>                 URL: https://issues.apache.org/jira/browse/HADOOP-2067
>             Project: Hadoop
>          Issue Type: Bug
>          Components: dfs
>    Affects Versions: 0.14.3
>            Reporter: lohit vijayarenu
>         Attachments: stack_trace_13_and_14.txt
>
>
> It looks like calling close() more than once while reading files from DFS 
> fails in Hadoop 0.14. This was somehow not caught in Hadoop 0.13.
> The use case was to open a file on DFS as shown below:
> <code>
> FSDataInputStream fSDataInputStream =
>     fileSystem.open(new Path(propertyFileName));
> Properties subProperties = new Properties();
> subProperties.loadFromXML(fSDataInputStream);
> fSDataInputStream.close();
> </code>
> This failed with an IOException:
> <exception>
> java.io.IOException: Stream closed
> </exception>
> The stack trace shows the stream being closed twice: Properties.loadFromXML() 
> closes the stream it is given, so the explicit close() that follows is a 
> second close. Hadoop 0.13 tolerated the second close, which hid the problem.
> Attached to this JIRA is a text file with the stack traces from both Hadoop 
> 0.13 and Hadoop 0.14.
> How should this be handled from a user's point of view? 
> Thanks

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
