Mark Dexter wrote:
We use a development environment that works with Postgres via ODBC and
uses cursors to insert and update rows in Postgres tables.  I'm using
Postgres version 7.4.5.

A. If I TRUNCATE or DELETE all of the rows in the table and then run
VACUUM or ANALYZE on the now-empty table, the test program takes over 15
minutes to complete (i.e., a 15x performance drop).

If we routinely run VACUUM or VACUUM ANALYZE (e.g., nightly), these work
tables will normally be empty when the VACUUM is run.  So it would appear
from the testing above that they will experience performance problems when
inserting large numbers of rows through our application.

Yep - it's a known issue. The analyse is doing exactly what you asked - it
records statistics for a table that is empty at the time - it's just not
what you want, because the planner will keep assuming the table is tiny
until the next analyse, long after the application has filled it up again.
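To see it concretely, here's a minimal sketch (the table name work_queue
and its columns are invented for illustration):

  -- a work table that happens to be empty at nightly-maintenance time
  CREATE TABLE work_queue (id integer, payload text);

  -- ANALYZE faithfully records statistics for an empty table
  ANALYZE work_queue;

  -- reltuples and relpages now say the table is (near) empty, so the
  -- planner keeps choosing plans suited to a tiny table even after the
  -- application loads thousands of rows, until the next ANALYZE
  SELECT relname, reltuples, relpages
    FROM pg_class
   WHERE relname = 'work_queue';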


Is there some easy way around this problem?  Is there a way to force
VACUUM or ANALYZE to optimize for a set number of rows even if the table
is empty when it is run?  Thanks for your help.  Mark

There are only two options I know of:
1. Vacuum analyse each table separately (tedious, I know) - see the sketch below
2. Try pg_autovacuum in the contrib/ directory
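For option 1, the nightly script would name each table explicitly, so the
work tables can be left out while they're empty (all table names here are
made up):

  -- analyse the ordinary tables as usual
  VACUUM ANALYZE customers;
  VACUUM ANALYZE orders;

  -- vacuum/analyse the work table only while it's populated, e.g. from
  -- the application right after a bulk load, so the stored statistics
  -- describe a full table rather than an empty one
  VACUUM ANALYZE work_queue;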

The autovacuum utility monitors activity for you and targets tables when they've seen a certain amount of activity. Even if it hasn't got the tunability you need, it should be a simple patch to add a list of "excluded" tables.
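From memory of the contrib README (do check it for the exact flags), it
builds like any other contrib module and runs as a separate daemon
alongside the postmaster:

  cd contrib/pg_autovacuum
  make && make install
  pg_autovacuum -D    # run in the background with the default thresholds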

--
  Richard Huxton
  Archonet Ltd
