On Wed, 21 Feb 2007 21:58:33 -
"Greg Sabino Mullane" <[EMAIL PROTECTED]> wrote:
> SELECT 'vacuum verbose analyze
> '||quote_ident(nspname)||'.'||quote_ident(relname)||';'
> FROM pg_class c, pg_namespace n
> WHERE relkind = 'r'
> AND relnamespace = n.oid
> AND nspname = 'novac'
ORDER BY 1;
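The query above generates one "vacuum verbose analyze" statement per table in the target schema; the output can then be fed back to psql. A minimal sketch of the string it builds, with a rough Python emulation of quote_ident (the helper names here are illustrative, not from the thread):

```python
def quote_ident(name: str) -> str:
    """Rough emulation of PostgreSQL's quote_ident: always double-quote,
    doubling any embedded double quotes. (The real function only quotes
    when necessary, but always quoting is also valid SQL.)"""
    return '"' + name.replace('"', '""') + '"'

def vacuum_statement(schema: str, table: str) -> str:
    """Build the same statement the SELECT above emits for one table."""
    return f"vacuum verbose analyze {quote_ident(schema)}.{quote_ident(table)};"

# Example: mixed-case or spaced table names stay safely quoted.
for t in ("orders", "Line Items"):
    print(vacuum_statement("novac", t))
```

quote_ident matters here precisely because table names with uppercase letters, spaces, or quotes would otherwise produce broken statements.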
-----BEGIN PGP SIGNED MESSAGE-----
Hash: RIPEMD160
A minor correction to my earlier post: I should have specified the
schema as well in the vacuum command for tables with the same
name in different schemas:
SET search_path = 'pg_catalog';
SELECT set_config('search_path',
current_setting('se
-----BEGIN PGP SIGNED MESSAGE-----
Hash: RIPEMD160
> Take a really different approach. Log in CSV format to text files
> instead, and only import the date ranges we need "on demand" if a report
> is requested on the data.
Seems like more work than a separate database to me. :)
> 2. We could fi
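For reference, the quoted "log to CSV, import on demand" idea could look roughly like this: one CSV file per day, and a helper that reads back only the files inside a requested date range. (The file layout and function names here are assumptions for illustration, not from the thread.)

```python
import csv
from datetime import date, timedelta
from pathlib import Path

def log_row(log_dir: Path, day: date, row: list) -> None:
    """Append one record to that day's CSV file (named YYYY-MM-DD.csv)."""
    path = log_dir / f"{day.isoformat()}.csv"
    with path.open("a", newline="") as f:
        csv.writer(f).writerow(row)

def load_range(log_dir: Path, start: date, end: date) -> list:
    """Read back only the days a report actually asks for."""
    rows = []
    day = start
    while day <= end:
        path = log_dir / f"{day.isoformat()}.csv"
        if path.exists():
            with path.open(newline="") as f:
                rows.extend(csv.reader(f))
        day += timedelta(days=1)
    return rows
```

The trade-off discussed in the thread applies: this keeps the bulk data out of the database entirely, at the cost of a separate import step whenever a report is requested.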
Our application has a table that is only logged to, and infrequently
used for reporting. There are generally no deletes or updates.
Recently, the sheer size (an estimated 36 million rows) caused a serious
problem because it prevented a "vacuum analyze" on the whole database
from finishing in a timely manner.