On May 3, 2006, at 8:18 AM, Michael Stone wrote:

On Tue, May 02, 2006 at 08:09:52PM -0600, Brendan Duddridge wrote:
              -------Sequential Output-------- ---Sequential Input-- --Random--
              -Per Char- --Block--- -Rewrite-- -Per Char- --Block--- --Seeks---
Machine    MB K/sec %CPU K/sec %CPU K/sec %CPU K/sec %CPU K/sec %CPU  /sec %CPU
            0 40365 99.4 211625 61.4 212425 57.0 50740 99.9 730515 100.0 45897.9 190.1
[snip]
Do these numbers seem decent enough for a Postgres database?

These numbers seem completely bogus, probably because bonnie is using a file size smaller than memory and is reporting caching effects. (730MB/s isn't possible for a single external RAID unit with a pair of 2Gb/s interfaces.) bonnie in general isn't particularly useful on modern large-ram systems, in my experience.


Bonnie++ is able to use very large datasets. It also tries to figure out the size you want (2x RAM); the original bonnie is limited to 2GB.
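
For example (the path, sizes, and user below are just placeholders, not from this thread), on a box with 8GB of RAM the invocation would look something like:

  # -d: directory on the array under test
  # -s: total file size in MiB (16384 MiB = 2x the assumed 8GB of RAM)
  # -r: the machine's RAM size in MiB, so bonnie++ can sanity-check -s
  # -u: user to run the test as when started as root
  bonnie++ -d /mnt/raid/bonnie -s 16384 -r 8192 -u postgres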

--
Jeff Trout <[EMAIL PROTECTED]>
http://www.jefftrout.com/
http://www.stuarthamm.net/



