[ 
https://issues.apache.org/jira/browse/HDFS-14240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16861710#comment-16861710
 ] 

Shen Yinjie commented on HDFS-14240:
------------------------------------

Sorry for my late reply, [~RANith].
I used a real namenode to run "NNThroughputBenchmark -fs hdfs://hc1:8020 -op 
blockReport -datanodes 10 -reports 30 -blocksPerReport 100 -blocksPerFile 10".

> blockReport test in NNThroughputBenchmark throws 
> ArrayIndexOutOfBoundsException
> -------------------------------------------------------------------------------
>
>                 Key: HDFS-14240
>                 URL: https://issues.apache.org/jira/browse/HDFS-14240
>             Project: Hadoop HDFS
>          Issue Type: Bug
>            Reporter: Shen Yinjie
>            Assignee: Ranith Sardar
>            Priority: Major
>         Attachments: screenshot-1.png
>
>
> When I run a blockReport test with NNThroughputBenchmark, 
> BlockReportStats.addBlocks() throws an ArrayIndexOutOfBoundsException.
> Digging into the code:
> {code:java}
> for (DatanodeInfo dnInfo : loc.getLocations()) {
>   int dnIdx = dnInfo.getXferPort() - 1;
>   datanodes[dnIdx].addBlock(loc.getBlock().getLocalBlock());
> }
> {code}
>
> The problem is here: the datanodes array's length is determined by the 
> "-datanodes" or "-threads" argument, but dnIdx is derived from 
> dnInfo.getXferPort(), which is a random port and can far exceed the 
> array's length.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
