Susan Russo wrote:
Hi Tom - thanks for the additional/confirming info.

So you've definitely got a problem of needing more vacuuming...

Yes, we're going to nightly vacuums, as I said in my last message; however, it's worse than that...

I found that *one* vacuum analyze works well in many instances to help optimize query performance (in one example, a query ran at lightning speed on 2 of our 5 identical software/hardware/config Pg 8.1.4 servers). However, in several cases a *second* vacuum
analyze was necessary.  (btw - the first vacuum was run after adjusting
max_fsm_pages, and vacuum reported no error msgs).
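As a sanity check on max_fsm_pages (a sketch; the INFO wording below is from memory and may differ slightly by version), a database-wide verbose vacuum reports how many page slots are needed versus configured:

    VACUUM VERBOSE;
    -- The last few lines summarize free space map usage, e.g.:
    --   INFO:  free space map contains 33 pages in 74 relations
    --   DETAIL:  A total of 1184 page slots are used (including overhead).
    --   1184 page slots are required to track all free space.
    --   Current limits are:  20000 page slots, 1000 relations, using 182 kB.
    -- If "required" exceeds the configured limit, raise max_fsm_pages.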

I *think* - please advise - I may be able to adjust the configs
so that the first vacuum analyze is effective (??).
Perhaps an increase to default_statistics_target (set to 100??).
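Something like this is what I have in mind (a sketch; "cvterm" is just a stand-in table name):

    -- Raise the sample size for this session only, then re-gather stats
    SET default_statistics_target = 100;   -- 8.1 default is 10
    ANALYZE cvterm;
    -- To make it permanent, set default_statistics_target = 100
    -- in postgresql.conf and reload.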

I'd read that when performing a vacuum analyze, Pg doesn't actually
go through all the values in each table to update statistics; rather,
it samples some of the values and builds statistics from that sample.
Thus, different runs of vacuum analyze might generate different
statistics, so the same db on different servers may end up with
different statistics even though the data is identical.   Is this correct??
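You can see what the sampler actually produced and compare it across servers or runs (a sketch; "cvterm" and "name" are hypothetical table/column names):

    -- Differences here from run to run reflect the random sampling
    SELECT attname, n_distinct, most_common_vals
    FROM pg_stats
    WHERE tablename = 'cvterm' AND attname = 'name';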

Thanks for any advice....I'm hoping that running vacuum
analyze twice on a regular basis isn't the solution...
Couple of points:

Update your pg servers. 8.1.9 is out, and plenty of bugs were fixed between 8.1.4 and 8.1.9. Upgrading to a newer minor release is relatively painless and worth the effort.

I get the feeling you think vacuum and analyze are still married. They're not; they got divorced around 7.3 or so. It used to be that to run analyze you needed vacuum. Now you can run either one without the other.
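For example (stock commands; "cvterm" again is just a stand-in table name):

    VACUUM cvterm;          -- reclaims dead tuples only
    ANALYZE cvterm;         -- refreshes planner statistics only
    VACUUM ANALYZE cvterm;  -- does both in one pass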

Vacuum is more expensive than analyze. Since vacuum reclaims dead tuples, it has to do more work than analyze, which only makes a quick pass over a random sample of the table. Hence you're right in what you heard: from one run to the next, the statistics analyze produces will usually be a bit different. Increasing the default stats target lets analyze look at a larger random sample and get a more accurate picture of the values and their distribution in the table. This comes at the cost of slightly longer analyze and query-planning times.
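If only a few columns are driving the bad plans, you can raise the target per column instead of globally (a sketch; table/column names are stand-ins):

    -- Sample this column more heavily than the default
    ALTER TABLE cvterm ALTER COLUMN name SET STATISTICS 100;
    ANALYZE cvterm;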
