[ https://issues.apache.org/jira/browse/HDFS-559?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12747615#action_12747615 ]

Konstantin Shvachko commented on HDFS-559:
------------------------------------------

According to my calculations (without compressed pointers):

sizeof(BlockInfo) = 72 + 24*replication

This breaks down as follows:
Object header = 16
Three longs in the block = 8*3
INode reference = 8
Array header = 24
Three references for each replica = 24*replication
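
A rough sketch of the same arithmetic as plain Java (BlockInfoFootprint is a hypothetical helper, not NameNode code; the three block longs are blockId, numBytes and the generation stamp):

/** Estimates the uncompressed-pointer heap footprint of a BlockInfo. */
public class BlockInfoFootprint {
  private static final int OBJECT_HEADER = 16;      // mark word + class pointer
  private static final int BLOCK_LONGS   = 3 * 8;   // blockId, numBytes, generationStamp
  private static final int INODE_REF     = 8;       // reference to the owning INode
  private static final int ARRAY_HEADER  = 24;      // header of the triplets array
  private static final int PER_REPLICA   = 3 * 8;   // three references per replica

  /** sizeof(BlockInfo) = 72 + 24 * replication */
  public static long sizeOf(int replication) {
    return OBJECT_HEADER + BLOCK_LONGS + INODE_REF + ARRAY_HEADER
        + (long) PER_REPLICA * replication;
  }

  public static void main(String[] args) {
    System.out.println(sizeOf(3));  // 144 bytes at the default replication of 3
  }
}

At the default replication of 3 this gives 72 + 24*3 = 144 bytes per block, before any compressed-oops savings.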

> Work out the memory consumption of NN artifacts on a compressed pointer JVM
> ---------------------------------------------------------------------------
>
>                 Key: HDFS-559
>                 URL: https://issues.apache.org/jira/browse/HDFS-559
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>          Components: name-node
>    Affects Versions: 0.21.0
>         Environment: 64-bit and 32 bit JVMs, Java6u14 and jdk7 betas, with 
> -XX compressed oops enabled/disabled
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Minor
>
> Following up HADOOP-1687, it would be nice to know the size of datatypes 
> under the Java6u14 JVM, which offers compressed pointers.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
