I've graphed the rate of change of the TotalReadLatencyMicros counter over the 
last 12 hours, dividing by 1,000,000 to convert microseconds to seconds.  I'm 
sampling it every 10 seconds, so I divided each delta by another 10 to get a 
per-second rate.
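A minimal sketch of that arithmetic, with made-up sample values (the counter
name matches the Cassandra JMX metric; the numbers are hypothetical):

```python
# Hypothetical consecutive samples of TotalReadLatencyMicros, 10s apart.
interval_s = 10              # polling interval in seconds
prev_total = 2_100_000_000   # counter value at t
curr_total = 2_200_000_000   # counter value at t + 10s

delta_micros = curr_total - prev_total
# micros -> seconds, then per-interval -> per-second
read_seconds_per_second = delta_micros / 1_000_000 / interval_s
print(read_seconds_per_second)  # → 10.0
```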

The result is that I have a CF doing 10 seconds of read *every second*.

Does that make sense?

If I divide that read time by the number of reads done over the same interval, 
it matches the latency I'm seeing from cfstats: 1.5ms/read.
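The cross-check works out like this (the reads-per-second figure is an
assumed value chosen to illustrate; only the 1.5ms/read result comes from
cfstats):

```python
read_seconds_per_second = 10.0  # from the graphed counter rate
reads_per_second = 6_667        # hypothetical read throughput over the same window

# total read time / read count = average latency per read, in milliseconds
latency_ms = read_seconds_per_second / reads_per_second * 1000
print(round(latency_ms, 1))  # → 1.5
```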
