> > [...]
> > There has to be a more linear way of handling this scenario.
> >
> > So vacuum the table often.
Good advice, except when the table is huge :-) Here, for example, we have some tables which are frequently updated but contain >100 million rows. Vacuuming such a table takes hours, and the dead row candidates are exactly the rows which are updated again and again and looked up frequently...

A good solution would be a new type of vacuum which does not need to do a full table scan but can clean up the pending dead rows without one... Then I could vacuum those tables really frequently.

Cheers,
Csaba.
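In the meantime, about the only workaround is to schedule plain VACUUM runs as often as the table size allows. A minimal sketch, assuming a hot table named big_table in a database mydb (both names are made up for illustration):

```
# Hypothetical crontab entry: vacuum the heavily-updated table nightly,
# when the hours-long scan hurts least. Table and database names are
# assumptions, not from the original post.
0 2 * * *  psql -d mydb -c 'VACUUM ANALYZE big_table;'
```

This still pays the full-table-scan cost each run, which is precisely the problem described above; it only bounds how long dead rows linger.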