[ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13823348#comment-13823348 ]

Uma Maheswara Rao G commented on HADOOP-9114:
---------------------------------------------

+1

> After dfs.checksum.type is defined as NULL, writing a file and calling hflush will
> throw java.lang.ArrayIndexOutOfBoundsException
> ----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-9114
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9114
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.0.1-alpha
>            Reporter: liuyang
>            Assignee: sathish
>            Priority: Minor
>         Attachments: FSOutputSummer.java.patch, HADOOP-9114-001.patch, HADOOP-9114-002.patch
>
>
> While testing the dfs.checksum.type configuration parameter, whose value can
> be defined as NULL, CRC32C, or CRC32: the client works correctly when the
> value is CRC32C or CRC32, but throws java.lang.ArrayIndexOutOfBoundsException
> when the value is configured as NULL.

--
This message was sent by Atlassian JIRA
(v6.1#6144)
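For reference, the configuration described in the report can be expressed as a fragment like the following in the client's hdfs-site.xml. This is a minimal sketch based only on the property name and values quoted above (NULL, CRC32C, CRC32); it is the setting that reportedly triggers the exception, not a recommended configuration.

```xml
<!-- hdfs-site.xml (client side): checksum algorithm used when writing data.
     Values named in this report: CRC32C, CRC32, and NULL.
     Per HADOOP-9114, NULL leads to java.lang.ArrayIndexOutOfBoundsException
     on write/hflush before the fix. -->
<property>
  <name>dfs.checksum.type</name>
  <value>NULL</value>
</property>
```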