Thanks Doug,
I was doing an UPDATE on 100 million rows, and I was updating an indexed
column; it was also the column that I was basing my search on.
UPDATE "Calls" SET "GroupCode"='100 my street' WHERE "GroupCode"='' AND "Site"='05'
GroupCode was indexed. I dropped the index and the query ran much faster.
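In case it helps anyone else, the sequence I ended up with was roughly the
following (the index name "calls_groupcode_idx" is a placeholder; substitute
whatever your index is actually called):

-- Drop the index on the column being updated so it isn't maintained
-- row-by-row during the UPDATE, then rebuild it once at the end.
-- "calls_groupcode_idx" is a hypothetical name; check \di for yours.
BEGIN;
DROP INDEX "calls_groupcode_idx";
UPDATE "Calls" SET "GroupCode"='100 my street'
    WHERE "GroupCode"='' AND "Site"='05';
COMMIT;

CREATE INDEX "calls_groupcode_idx" ON "Calls" ("GroupCode");
ANALYZE "Calls";

Rebuilding the index once after the UPDATE is far cheaper than updating it
for every one of the 100 million rows as they change.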
I need to do an UPDATE on a large (100 million record) table. Is there
any way to speed up the process (like turning off the transaction log)?
So far postgres has been handling the large database exceptionally well
(large \copy imports and WHERE clauses w/ multiple params) but it is
killing me on this UPDATE.