In article <[EMAIL PROTECTED]>,
"Creager, Robert S" <[EMAIL PROTECTED]> wrote:

> I think this is a question regarding the backend, but...

[snip]

> (COPY u FROM stdin). The backend process which handles the db connection
> decides that it needs a whole lot of memory, although in a nice
> controlled manner.  The backend starts with using 6.5Mb, and at 25000
> records copied, it's taken 10Mb and has slowed down substantially. 
> Needless to say, this COPY will not finish before running out of memory
> (estimated 300Mb).  When executing the COPY to the loc table, this
> problem does not occur.  Am I going to have to resort to inserts for the
> referring tables?  

I can't answer the backend question, but how about running
'split' on the big file and then COPYing the smaller pieces one at a time?
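
Something like this, roughly -- the file, chunk size, table, and
database names here are just placeholders, adjust to taste:

```shell
# Break the big dump into 5000-line pieces: chunk_aa, chunk_ab, ...
split -l 5000 bigfile.txt chunk_

# COPY each piece in its own session, so no single backend
# has to hold the whole load at once.
for f in chunk_*; do
    psql -d mydb -c "COPY u FROM stdin" < "$f"
done
```

Since split never breaks a line in two, each chunk is still a valid
COPY input, and concatenating the chunks gives back the original file.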

Gordon.

-- 
It doesn't get any easier, you just go faster.
   -- Greg LeMond
