Hi all,

I'm running a COPY on a 37GB CSV file and receiving the following error:

"invalid string enlargement request size 65536"

The file has about 70 million lines with 101 columns, all of them varchar.

When I run the command on the whole file, I receive the error after loading
about 29 million lines. So I split the file into 10-million-line chunks with
split:

split --lines=10000000
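
The full invocation was along these lines (the file names here are just
placeholders):

split --lines=10000000 bigfile.csv chunk_

which produces chunk_aa, chunk_ab, and so on.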

Running the COPY again, I receive the error on the 5th file:

psql:/srv/www/htdocs/import/script_q2.sql:122: ERROR:  invalid string
enlargement request size 65536
CONTEXT:  COPY temp_q2, line 3509639: ""000000009367276";"4";"DANIEL DO
CARMO BARROS";"31-Jan-1986";"M";"1";"10";"3162906";"GILSON TEIXEIRA..."
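
The COPY in the script is essentially the following (the path here is a
placeholder; the delimiter and quote character match the sample line above):

COPY temp_q2 FROM '/path/to/chunk' WITH DELIMITER ';' CSV QUOTE '"';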

Any clues?

My PostgreSQL version is 8.2.4; the server is running SUSE Linux with 1.5GB
of RAM. The relevant settings in postgresql.conf are:

shared_buffers = 512MB
temp_buffers = 256MB
checkpoint_segments = 60

I'd also like to know if there's any way to optimize huge data loads like
this.

Regards

Adonias Malosso
