[ https://issues.apache.org/jira/browse/HBASE-1590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12726239#action_12726239 ]

Nitay Joffe commented on HBASE-1590:
------------------------------------

What if you maintain a Set<Object> of references that have already been 
counted? That way you can traverse any data structure and check whether you 
need to recurse. For example, when you reach the 'parent' reference you'll see 
it has already been counted, so you count the reference itself without 
recursing into it. 
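
Roughly, that idea might look like the sketch below (hypothetical names, not 
HBase's actual ClassSize code; the 16-byte header and 8-byte reference 
constants are assumptions for a plain 64-bit JVM):

import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.Collections;
import java.util.IdentityHashMap;
import java.util.Set;

public class DeepSizer {
  private static final long HEADER = 16; // assumed object header, 64-bit JVM
  private static final long REF = 8;     // assumed reference size, no compressed oops

  public static long estimateDeep(Object root) {
    // Identity semantics: two distinct-but-equal objects must both be counted.
    Set<Object> seen = Collections.newSetFromMap(new IdentityHashMap<Object, Boolean>());
    return sizeOf(root, seen);
  }

  private static long sizeOf(Object obj, Set<Object> seen) {
    if (obj == null || !seen.add(obj)) {
      return 0; // already counted (e.g. a 'parent' back-reference): nothing extra
    }
    long size = HEADER;
    Class<?> cls = obj.getClass();
    if (cls.isArray()) {
      int len = Array.getLength(obj);
      if (cls.getComponentType().isPrimitive()) {
        size += (long) len * primitiveSize(cls.getComponentType());
      } else {
        size += (long) len * REF;
        for (int i = 0; i < len; i++) {
          size += sizeOf(Array.get(obj, i), seen);
        }
      }
      return size;
    }
    for (Class<?> c = cls; c != null; c = c.getSuperclass()) {
      for (Field f : c.getDeclaredFields()) {
        if (Modifier.isStatic(f.getModifiers())) {
          continue;
        }
        if (f.getType().isPrimitive()) {
          size += primitiveSize(f.getType());
        } else {
          size += REF; // the reference slot itself is always charged...
          try {
            f.setAccessible(true);
            size += sizeOf(f.get(obj), seen); // ...the referent only if unseen
          } catch (IllegalAccessException e) {
            // best-effort estimate: skip unreadable fields
          }
        }
      }
    }
    return size;
  }

  private static long primitiveSize(Class<?> t) {
    if (t == long.class || t == double.class) return 8;
    if (t == int.class || t == float.class) return 4;
    if (t == short.class || t == char.class) return 2;
    return 1; // byte, boolean
  }
}

The seen set is backed by an IdentityHashMap rather than a HashSet so that 
distinct objects which happen to be equals() are each counted, and cyclic 
references such as 'parent' terminate the recursion instead of looping forever.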

> Extend TestHeapSize and ClassSize to do "deep" sizing of Objects
> ----------------------------------------------------------------
>
>                 Key: HBASE-1590
>                 URL: https://issues.apache.org/jira/browse/HBASE-1590
>             Project: Hadoop HBase
>          Issue Type: Improvement
>    Affects Versions: 0.20.0
>            Reporter: Jonathan Gray
>             Fix For: 0.20.0
>
>
> As discussed in HBASE-1554 there is a bit of a disconnect between how 
> ClassSize calculates the heap size and how we need to calculate heap size in 
> our implementations.
> For example, the LRU block cache can be sized via ClassSize, but it is a 
> shallow sizing.  There is a backing ConcurrentHashMap that is the largest 
> memory consumer.  However, ClassSize only counts that as a single reference.  
> But in our heapSize() reporting, we want to include *everything* within that 
> Object.
> This issue is to resolve that dissonance.  We may need to create an 
> additional ClassSize.estimateDeep(), we may need to rethink our HeapSize 
> interface, or maybe just leave it as is.  The two primary goals of all this 
> testing are to 1) ensure that if something is changed and the sizing is not 
> updated, our tests fail, and 2) ensure our sizing is as accurate as possible.
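
To make goal 1) concrete, a minimal test sketch might look like the following 
(hypothetical names and size constants, not HBase's actual TestHeapSize): an 
independent estimate of the object's size is compared against the 
hand-maintained heapSize(), so adding a field without updating the bookkeeping 
makes the test fail.

import static org.junit.Assert.assertEquals;

import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

import org.junit.Test;

public class TestDeepSizing {

  /** Toy stand-in for a class that reports its own heap usage. */
  static class SimpleCache {
    long hits;      // 8 bytes
    int capacity;   // 4 bytes, plus 4 bytes of padding

    // Hand-maintained size: assumed 16-byte header + fields, padded to 8 bytes.
    long heapSize() {
      return 16 + 8 + 4 + 4;
    }
  }

  /** Crude independent estimate: header plus one 8-byte slot per instance field. */
  static long estimate(Class<?> cls) {
    long size = 16; // assumed object header on a 64-bit JVM
    for (Field f : cls.getDeclaredFields()) {
      if (!Modifier.isStatic(f.getModifiers())) {
        size += 8;
      }
    }
    return size;
  }

  @Test
  public void heapSizeTracksFieldChanges() {
    // Adding a field to SimpleCache without updating heapSize() should fail here.
    assertEquals(estimate(SimpleCache.class), new SimpleCache().heapSize());
  }
}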

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
