[ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

liuyang updated HADOOP-9114:
----------------------------

    Status: Patch Available  (was: Open)
    
> After defining dfs.checksum.type as NULL, writing a file and calling hflush 
> will throw java.lang.ArrayIndexOutOfBoundsException
> ----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-9114
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9114
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.0.1-alpha
>            Reporter: liuyang
>         Attachments: FSOutputSummer.java.patch
>
>
> When I tested the dfs.checksum.type parameter, whose value can be set to 
> NULL, CRC32C, or CRC32: writing works when the value is CRC32C or CRC32, 
> but the client throws java.lang.ArrayIndexOutOfBoundsException when the 
> value is configured as NULL.

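For reference, a minimal reproduction sketch against the HDFS client API, assuming a reachable cluster and that the exception surfaces on hflush as described; the output path and class name below are placeholders, not part of the report.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical reproduction sketch: write a small file with checksums disabled
// via dfs.checksum.type=NULL, then hflush it.
public class ChecksumTypeNullRepro {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NULL disables checksums; CRC32 and CRC32C reportedly work fine.
        conf.set("dfs.checksum.type", "NULL");

        FileSystem fs = FileSystem.get(conf);
        Path p = new Path("/tmp/checksum-null-test"); // placeholder path
        FSDataOutputStream out = fs.create(p);
        try {
            out.write("some data".getBytes("UTF-8"));
            // The reporter observed java.lang.ArrayIndexOutOfBoundsException
            // at this point when the checksum type is NULL.
            out.hflush();
        } finally {
            out.close();
        }
    }
}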
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
