So,

Take obytes64/rbytes64 at two points in time, say 5 seconds apart. Add the
two deltas (outbound and inbound) to get the total bytes transferred during
that interval.

Divide by the 5-second interval and you have the throughput in bytes/sec.
From there you can just divide by the Gbit link speed (expressed in
bytes/sec) to get a utilization percentage.
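
In case it helps, here is a minimal sketch of that calculation in Python.
The kstat triple e1000g:0:e1000g0 is only an example -- substitute the
module:instance:name your own driver exposes (kstat -p | grep obytes64 will
show it) -- and the link is assumed to be 1 Gbit:

#!/usr/bin/env python
# Rough sketch of the calculation described above.  The kstat name
# e1000g:0:e1000g0 is just an example -- use the module:instance:name
# for your own NIC (bge, nge, ...), and adjust LINK_BPS to your link speed.
import subprocess
import time

INTERVAL = 5                  # seconds between the two samples
LINK_BPS = 10**9 / 8.0        # 1 Gbit/s expressed in bytes/sec (125 MB/s)

def read_counter(stat):
    # 'kstat -p' prints "module:instance:name:statistic<TAB>value"
    out = subprocess.check_output(
        ["kstat", "-p", "e1000g:0:e1000g0:%s" % stat])
    return int(out.split()[-1])

def total_bytes():
    # bytes sent plus bytes received so far
    return read_counter("obytes64") + read_counter("rbytes64")

before = total_bytes()
time.sleep(INTERVAL)
after = total_bytes()

bytes_per_sec = (after - before) / float(INTERVAL)
utilization = 100.0 * bytes_per_sec / LINK_BPS

print("throughput : %.2f MB/s" % (bytes_per_sec / 1e6))
print("utilization: %.1f %%" % utilization)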

A 1Gbit interface does 128MB/s ... but isn't this theoretical? Yes, I've seen 
12MB/s on 100Mbit ethernet... but I seldom see >100MB/s on Gbit ethernet. What 
would be the correct value to consider as 100% utilization?
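
For what it's worth, a back-of-the-envelope answer (my own arithmetic, not
from any official doc): the wire rate of 1 Gbit ethernet is 10^9 bits/s,
i.e. 125 MB/s in decimal megabytes rather than 128, and every frame also
carries framing overhead that never shows up as data:

# Line rate and best-case payload rate for full-size 1500-byte frames.
# On the wire each frame occupies 1538 bytes: preamble 8 + MAC header 14
# + payload 1500 + FCS 4 + interframe gap 12.
line_rate_bytes = 10**9 / 8.0                 # 125.0 MB/s raw
frame_on_wire   = 8 + 14 + 1500 + 4 + 12      # 1538 bytes per frame
efficiency      = 1500.0 / frame_on_wire      # ~97.5%

print("raw line rate     : %.1f MB/s" % (line_rate_bytes / 1e6))
print("max 1500B payload : %.1f MB/s" % (line_rate_bytes * efficiency / 1e6))

So roughly 118-122 MB/s of actual data is already effectively 100% on a
1 Gbit link; anything below that is TCP/IP header overhead, smaller frames,
or the sender simply not keeping the pipe full.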
 
 