On Wed, 6 May 2009, Richard Elling wrote:

>> Memory is meant to be used. 96% RAM use is good since it represents
>> an effective use of your investment.
>
> Actually, I think a percentage of RAM is a bogus metric to measure.
> For example, on a 2TByte system, you would be wasting 80 GBytes.
> Perhaps you should look for a more meaningful threshold.

So percent of memory consumed is not a useful efficiency metric? Is this true even if a double precision floating point value is used? :-)

It seems like a more useful measure (on a server) is a caching efficiency metric. If the cache hit ratio is poor yet the cache is continually being loaded with new data, then there may be a resource availability issue.
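To make that concrete, here is a minimal sketch (in Python) of such a hit-ratio metric. The counter names echo the ZFS ARC kstats (`zfs:0:arcstats:hits` and `:misses`, readable on Solaris with `kstat -p zfs:0:arcstats`), but the values below are made up for illustration:

```python
# Sketch: derive a cache efficiency metric from cumulative hit/miss
# counters, rather than from percent of RAM consumed.

def hit_ratio(hits: int, misses: int) -> float:
    """Return the fraction of cache lookups served from the cache."""
    total = hits + misses
    if total == 0:
        return 0.0
    return hits / total

# Hypothetical counter snapshot (real values would come from kstat):
arc_hits = 9_500_000
arc_misses = 500_000

print(f"ARC hit ratio: {hit_ratio(arc_hits, arc_misses):.1%}")  # 95.0%
```

In practice you would sample the counters twice and compute the ratio over the delta, so the metric reflects recent behavior rather than the lifetime average.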

Bob
--
Bob Friesenhahn
bfrie...@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/
_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss