hi,
  we have a weird situation here.  we have a table of approx. 10k rows
representing accumulated activity by specific customers.  as information
is gathered, those customers' rows are updated.  the number of rows does
not increase unless we get a new customer, so that is not a factor.  the
table is defined as follows:

      Table "account_summary_02"
  Attribute  |    Type     | Modifier
-------------+-------------+----------
 bill_br_id  | bigint      | not null
 cust_id     | varchar(15) | not null
 btn_id      | varchar(15) | not null
 ln_id       | varchar(15) | not null
 ct_key      | float8      | not null
 as_quantity | float8      | not null
 as_charges  | float8      | not null
 as_count    | float8      | not null
Index: account_summary_02_unq_idx
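
the index is on the first 5 columns, something like this (roughly; the
"unq" in the name means it's the unique index):

    -- approximate definition, reconstructed from the \d output above
    CREATE UNIQUE INDEX account_summary_02_unq_idx
        ON account_summary_02 (bill_br_id, cust_id, btn_id, ln_id, ct_key);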

here's the situation.  after about 50,000 updates, which fly right along,
the process begins to really bog down.  we perform a vacuum analyze and it
speeds right up again.  my question is: is there a way to perform these
updates, potentially 500k to 1 million in a day, without having to vacuum
so frequently?  maybe some setting or parameter to be changed?  the update
query is doing an index scan.
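
to give a concrete picture, the updates are basically of this form (the
values here are just illustrative), and the vacuum is a plain vacuum
analyze on the table:

    -- typical accumulation update (hypothetical values); it goes through
    -- the unique index, so only one row is touched per statement
    UPDATE account_summary_02
       SET as_quantity = as_quantity + 1.0,
           as_charges  = as_charges  + 0.25,
           as_count    = as_count    + 1
     WHERE bill_br_id = 42
       AND cust_id    = 'CUST001'
       AND btn_id     = 'BTN001'
       AND ln_id      = 'LN001'
       AND ct_key     = 7;

    -- what we run when things bog down
    VACUUM ANALYZE account_summary_02;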

mikeo 
