On Mon, 25 Aug 2008 10:21:54 -0400
"John T. Dow" <[EMAIL PROTECTED]> wrote:

> By "bad data", I mean a character that's not UTF8, such as hex 98.
> 
> As far as I can tell, pg_dump is the tool to use. But it has
> serious drawbacks.
> 
> If you dump in the custom format, the data is compressed (nice) and
> includes large objects (very nice). But, from my tests and the
> postings of others, if there is invalid data in a table, although
> PostgreSQL won't complain and pg_dump won't complain, pg_restore will
> strenuously object, rejecting all rows for that particular table (not
> nice at all).

You can use the TOC feature of -Fc to skip restoring that single
table. Then convert that one table to a plain-text dump, clean the
data by hand, and restore it separately.
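Concretely, the TOC workflow might look like the sketch below. The dump
file, database name, and table name (mydb.dump, mydb, public.badtable)
are placeholders; adjust them to your setup. In a pg_restore TOC list,
lines beginning with ";" are treated as comments and skipped:

```shell
# 1. Write out the table of contents of the custom-format dump.
pg_restore -l mydb.dump > mydb.toc

# 2. Comment out the data entry for the problem table
#    (assumed here to be public.badtable) so it is not restored.
sed -i 's/^\(.*TABLE DATA public badtable.*\)$/;\1/' mydb.toc

# 3. Restore everything else, driven by the edited TOC.
pg_restore -L mydb.toc -d mydb mydb.dump

# 4. Extract just that table's data as plain SQL for hand-cleaning,
#    then load it with psql once the bad bytes are fixed.
pg_restore -a -t badtable -f badtable.sql mydb.dump
psql -d mydb -f badtable.sql
```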

If you have foreign keys and indexes on the table with the bad data,
don't restore them until *after* you have done the above.
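One way to defer those with the same TOC list is to split it into two
passes, filtering on the entry types pg_restore prints (file names here
are placeholders):

```shell
# First pass: everything except FK constraints and indexes.
grep -Ev 'FK CONSTRAINT|INDEX' mydb.toc > first-pass.toc
pg_restore -L first-pass.toc -d mydb mydb.dump

# Second pass, after the cleaned table is loaded: apply the
# deferred constraints and indexes.
grep -E 'FK CONSTRAINT|INDEX' mydb.toc > second-pass.toc
pg_restore -L second-pass.toc -d mydb mydb.dump
```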

Sincerely,

Joshua D. Drake

-- 
The PostgreSQL Company since 1997: http://www.commandprompt.com/ 
PostgreSQL Community Conference: http://www.postgresqlconference.org/
United States PostgreSQL Association: http://www.postgresql.us/
Donate to the PostgreSQL Project: http://www.postgresql.org/about/donate



-- 
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
