[ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13820811#comment-13820811 ]

Vinay commented on HADOOP-9114:
-------------------------------

I am not able to assign this issue to sathish. Maybe he needs to be added 
as a contributor in JIRA?

> After defining dfs.checksum.type as NULL, writing a file and calling hflush 
> will throw java.lang.ArrayIndexOutOfBoundsException
> ----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-9114
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9114
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.0.1-alpha
>            Reporter: liuyang
>            Priority: Minor
>         Attachments: FSOutputSummer.java.patch, HADOOP-9114-001.patch, 
> HADOOP-9114-002.patch
>
>
> While testing the dfs.checksum.type parameter: the value can be set to 
> NULL, CRC32C, or CRC32. It works fine when the value is CRC32C or CRC32, 
> but the client throws java.lang.ArrayIndexOutOfBoundsException when the 
> value is configured as NULL.



--
This message was sent by Atlassian JIRA
(v6.1#6144)
