On 12 Oct 2001, Doug McNaught wrote:

> Joseph Koenig <[EMAIL PROTECTED]> writes:
>
> > I have a project where a client has products stored in a large Progress
> > DB on an NT server. The web server is a FreeBSD box though, and the
> > client wants to try to avoid the $5,500 license for the Unlimited
> > Connections via OpenLink software and would like to take advantage of
> > the 'free' non-expiring 2 connection (concurrent) license. This wouldn't
> > be a huge problem, but the DB can easily reach 1 million records. Is
> > there any good way to pull this data out of Progress and get it into
> > Postgres? This is way too large of a db to do a "SELECT * FROM table"
> > and do an insert for each row. Any brilliant ideas? Thanks,
>
> Probably the best thing to do is to export the data from Progress in a
> format that the PostgreSQL COPY command can read.  See the docs for
> details.
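
For reference, that route would look roughly like this (a minimal sketch; the
table and file names are made up, and it assumes the Progress side can export
the table as a tab-delimited text file, which is COPY's default format):

    -- server-side load: the file must be readable by the backend
    COPY products FROM '/tmp/products.tsv';

or, from psql, the client-side equivalent:

    \copy products from 'products.tsv'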

I'm going to have to rant now. The "dump" and "restore" that use the COPY
method are effectively useless for large databases, and the reason is simple:
copying a 4 GB table with 40M rows needs over 40 GB of temporary scratch
space, because the whole load runs as a single transaction and the WAL files
keep piling up until it commits. That is plainly silly. Why doesn't pg_dump
insert a commit every 1000 rows or so???
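
If anyone needs a workaround in the meantime, one option is to split the
exported data into chunks and load each chunk with its own COPY, so every
batch commits separately and the WAL can be cleaned up between batches. A
rough sketch (the file, table and database names are made up):

    # split the export into one-million-row chunks (chunk_aa, chunk_ab, ...)
    split -l 1000000 products.tsv chunk_

    # load each chunk in its own transaction via psql's client-side \copy
    for f in chunk_*; do
        psql mydb -c "\copy products from '$f'"
    done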

Cheers.

Gordan


