frank church wrote:
I am loading lots of data via SQL into a database, and wrapping it in
transactions speeds it up.
However, this fails a number of times. The query results are logged, so it is
easy for me to find problem records.
However, a single failure causes the whole transaction to fail.
Is there a setting or feature which allows the same performance as
transactions without causing the whole process to fail, like a delayed-update
or delayed-write mechanism of some sort?
Not as it stands. I tend to use a small Perl wrapper myself that loads
in batches of, e.g., 10000 rows and, if there is an error, deals with it
separately.
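The batching-with-fallback idea above can be sketched roughly as follows. This is not the actual Perl wrapper mentioned; it is a minimal, hypothetical Python illustration using the stdlib `sqlite3` module (the table name `t` and its columns are invented for the example). Each batch is committed as one transaction; if a batch fails, it is rolled back and retried row by row so that only the genuinely bad rows are skipped and reported:

```python
import sqlite3

def load_in_batches(conn, rows, batch_size=10000):
    """Insert rows in batched transactions; on a batch failure,
    fall back to per-row inserts so only the bad rows are skipped."""
    bad_rows = []
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            # `with conn:` commits on success, rolls back on exception
            with conn:
                conn.executemany("INSERT INTO t (id, val) VALUES (?, ?)", batch)
        except sqlite3.Error:
            # Whole batch rolled back: isolate the bad rows one at a time
            for row in batch:
                try:
                    with conn:
                        conn.execute("INSERT INTO t (id, val) VALUES (?, ?)", row)
                except sqlite3.Error:
                    bad_rows.append(row)
    return bad_rows
```

With a large batch size you keep most of the transactional speed-up, and you only pay the per-row cost for batches that actually contain a failure.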
I seem to recall it being discussed as a built-in feature recently
though, so there might be someone working on it for a future version.
It is something I would like to set for that particular data load.
You might find the "pgloader" project meets your needs exactly:
http://pgfoundry.org/projects/pgloader/
--
Richard Huxton
Archonet Ltd