Hi,

I've recently migrated from MySQL to PostgreSQL, and as impressed as I am
with Postgres, I've found one seemingly missing feature a little
bothersome.

'mysqlimport' can skip duplicate records when doing bulk imports from
non-binary files. PostgreSQL doesn't seem to have an equivalent, and that
is a problem for me: I import extremely large amounts of data into
Postgres using 'copy', and it rejects the whole file if even one record
violates the primary key.
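
To make the problem concrete, here is a stripped-down version of what I
run into (the table and file names are invented for the example):

    -- a table with a primary key; any unique index behaves the same
    CREATE TABLE import_test (
        id   integer PRIMARY KEY,
        data text
    );

    -- if even one line of the file repeats an existing id, the load
    -- aborts with a duplicate-key error and none of the rows are kept
    COPY import_test FROM '/tmp/import_test.dat';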

I have managed to work around this by hacking
src/backend/access/nbtree/nbtinsert.c to call elog() with NOTICE instead
of ERROR, so the duplicate record is skipped and the import continues.
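
For reference, the change amounts to a one-word edit at the point where
the duplicate is detected (sketched from memory here; the exact message
text and surrounding code differ between releases):

    /* src/backend/access/nbtree/nbtinsert.c, in _bt_check_unique():
     * demoting the elog() level from ERROR to NOTICE means the
     * duplicate key is reported without aborting the whole COPY. */
    elog(NOTICE, "Cannot insert a duplicate key into unique index %s",
         RelationGetRelationName(rel));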

Is there a way to get around this without patching the source? If not,
will a future release of Postgres implement it as an option?

Thanks in advance,

Steve Micallef

