Hi folks,

In a Perl application I would like to frequently bulk load
several hundred thousand rows of data into a temporary
table before merging the contents into the database proper.
I'm currently doing individual INSERTs into the temporary
table, which obviously carries a significant performance
penalty: the import takes several minutes even on a
very fast machine.
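
Roughly, the current code looks like the following; the table,
columns and connection details are simplified for illustration:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', '', '',
                           { AutoCommit => 0, RaiseError => 1 });

    # Really several hundred thousand rows, built elsewhere.
    my @rows = ([1, 'foo'], [2, 'bar']);

    # One INSERT per row: correct, but painfully slow at this volume.
    my $sth = $dbh->prepare(
        'INSERT INTO import_tmp (id, name) VALUES (?, ?)');
    $sth->execute(@$_) for @rows;
    $dbh->commit;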

I'd like to switch to COPY, which should be orders of
magnitude faster.  I see that DBD::Pg has an interface for
this, which looks just fine.  My problem is how to
escape the data.  I need to follow whatever escaping rules
the server uses, which I've seen documented in the
manual; but to cope with any future changes to those
rules, and to guarantee identical behaviour, are there
any standard functions I can use to escape the data before
loading it?
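
For concreteness, here's a rough sketch of what I have in mind,
using the pg_putcopydata/pg_putcopyend calls.  The copy_escape
helper is my own hand-rolled implementation of the text-format
rules from the manual, and it's precisely that part I'd like to
replace with a standard function:

    use strict;
    use warnings;
    use DBI;

    # Hand-rolled escaping per the COPY text-format rules in the
    # manual: backslash, tab, newline and carriage return are
    # backslash-escaped, and an SQL NULL is represented as \N.
    sub copy_escape {
        my ($field) = @_;
        return '\\N' unless defined $field;
        $field =~ s/\\/\\\\/g;   # escape backslashes first
        $field =~ s/\t/\\t/g;
        $field =~ s/\n/\\n/g;
        $field =~ s/\r/\\r/g;
        return $field;
    }

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', '', '',
                           { AutoCommit => 0, RaiseError => 1 });

    # Really several hundred thousand rows, built elsewhere.
    my @rows = ([1, 'foo'], [2, undef]);

    $dbh->do('COPY import_tmp (id, name) FROM STDIN');
    $dbh->pg_putcopydata(join("\t", map { copy_escape($_) } @$_) . "\n")
        for @rows;
    $dbh->pg_putcopyend();
    $dbh->commit;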


Thanks,
Roger

-- 
  .''`.  Roger Leigh
 : :' :  Debian GNU/Linux             http://people.debian.org/~rleigh/
 `. `'   Printing on GNU/Linux?       http://gutenprint.sourceforge.net/
   `-    GPG Public Key: 0x25BFB848   Please GPG sign your mail.
