[ https://issues.apache.org/jira/browse/HDFS-94?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12776711#action_12776711 ]

Tsz Wo (Nicholas), SZE commented on HDFS-94:
--------------------------------------------

> I wonder why it shows a total of 17.78GB instead of 20GB

Could it be that you have hit some limit?  The following is quoted from the 
[java man 
page|http://java.sun.com/javase/6/docs/technotes/tools/solaris/java.html]:

{quote}
On Solaris 7 and Solaris 8 SPARC platforms, the upper limit for this value is 
approximately 4000m minus overhead amounts. On Solaris 2.6 and x86 platforms, 
the upper limit is approximately 2000m minus overhead amounts. On Linux 
platforms, the upper limit is approximately 2000m minus overhead amounts. 
{quote}
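
Checking Runtime.getRuntime().maxMemory() on the same JVM would show what ceiling actually took effect; maxMemory() can also report somewhat less than the raw -Xmx value (for example, one survivor space is typically not counted), which may account for part of the 17.78GB vs 20GB gap. A minimal standalone check, assuming nothing about the HDFS code itself:

{code}
// Minimal sketch (plain Java, not HDFS code): print the heap figures the JVM reports.
public class MaxHeapCheck {
  public static void main(String[] args) {
    Runtime rt = Runtime.getRuntime();
    System.out.printf("maxMemory   = %.2f GB%n", rt.maxMemory() / 1e9);   // effective heap ceiling
    System.out.printf("totalMemory = %.2f GB%n", rt.totalMemory() / 1e9); // heap committed so far
    System.out.printf("freeMemory  = %.2f GB%n", rt.freeMemory() / 1e9);  // unused part of the committed heap
  }
}
{code}

Running this with the same -Xmx setting as the NameNode would tell whether the 17.78GB comes from the JVM itself or from the web UI's arithmetic.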

> The "Heap Size" in HDFS web ui may not be accurate
> --------------------------------------------------
>
>                 Key: HDFS-94
>                 URL: https://issues.apache.org/jira/browse/HDFS-94
>             Project: Hadoop HDFS
>          Issue Type: Bug
>            Reporter: Tsz Wo (Nicholas), SZE
>
> It seems that the Heap Size shown in the HDFS web UI is not accurate.  It keeps 
> showing 100% usage, e.g.
> {noformat}
> Heap Size is 10.01 GB / 10.01 GB (100%) 
> {noformat}
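
A display like the one above can read 100% even when most of the heap is free, if the page compares the committed heap (Runtime.totalMemory()) against the maximum heap (Runtime.maxMemory()): once the JVM has grown the heap to its configured maximum, or when -Xms equals -Xmx, the two numbers coincide. The following standalone sketch (an illustration, not the actual HDFS web UI code) contrasts that reading with a used-versus-max figure:

{code}
// Illustration only (not the actual HDFS web UI code): two ways of rendering a heap-size line.
public class HeapDisplaySketch {
  public static void main(String[] args) {
    Runtime rt = Runtime.getRuntime();
    long used = rt.totalMemory() - rt.freeMemory(); // bytes currently occupied by objects
    long committed = rt.totalMemory();              // heap the JVM has committed so far
    long max = rt.maxMemory();                      // ceiling the JVM may grow to

    // Committed vs. max: reads 100% whenever the heap has grown to its maximum,
    // regardless of how much of it is actually in use.
    System.out.printf("Heap Size is %.2f GB / %.2f GB (%.0f%%)%n",
        committed / 1e9, max / 1e9, 100.0 * committed / max);

    // Used vs. max: reflects actual occupancy and can drop below 100%.
    System.out.printf("Heap Size is %.2f GB / %.2f GB (%.0f%%)%n",
        used / 1e9, max / 1e9, 100.0 * used / max);
  }
}
{code}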

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
