Hi Jodi,

> None of the data is actually committed to the database until the scripts
> complete so I believe that autocommit is turned off.
> 
What if you try writing the output of your script into a separate file
and then piping that file into psql? What I mean is to strip off the
processing time of the "excel" part. Does the job still take 25 minutes?
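A minimal sketch of that split (table name, column names, and database name here are hypothetical — substitute your own): generate all the INSERT statements into a file first, wrapped in one transaction, then time only the psql run.

```shell
# Step 1: generate the SQL up front (this stands in for the "excel" part).
# BEGIN/COMMIT wrap the rows in a single transaction, matching the
# no-autocommit behaviour described above.
{
  echo "BEGIN;"
  for i in 1 2 3; do
    echo "INSERT INTO mytable (id, name) VALUES ($i, 'row$i');"
  done
  echo "COMMIT;"
} > out.sql

# Step 2: time only the database part, e.g.:
#   time psql -d mydb -f out.sql
cat out.sql
```

If step 2 finishes in seconds, the bottleneck is the script's data generation, not PostgreSQL.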

We often insert data in similar amounts (10,000 - 100,000 rows per job)
within a few seconds or minutes. A few months ago I had the same problem
"writing" a dbf file from Postgres data: the SELECT statement took
milliseconds, but the conversion into dbf format seemed to take forever.

BTW: We also had a table (tens of thousands of rows, vacuumed daily) that
was rather slow during inserts (PostgreSQL 7.1.2). After upgrading to
version 7.1.3 and completely rebuilding the tables, the problem went away.

Hope this helps,
R. Luettecke

-- 
MICHAEL TELECOM AG
Bruchheide 34 - 49163 Bohmte
Fon: +49 5471 806-0
[EMAIL PROTECTED]
http://www.michael-telecom.de
