Sorry. I omitted an important detail from the previous post:

I have a table of almost 6 million records, about 1.5 GB in size.
I need to iterate over the table with a Perl script, examining
every record.

This code:

$sth_gnr = $dbh->prepare(
    "SELECT SQL_BIG_RESULT * FROM Tiger_main " .
    "WHERE rdid > '' " .
    "ORDER BY rdid"
) or die 'bad prepare gnr';

$sth_gnr->execute;

dies (during the execute) with the error:

Out of Memory: Killed process 31666 (temp.pl).
Killed

The same query run from the mysql command-line client dies with an
almost identical error message.
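For what it's worth, the usual cause of this symptom is that DBD::mysql (like the mysql client) buffers the entire result set in client memory before the first row is returned. DBD::mysql documents a mysql_use_result attribute that switches the driver to streaming rows from the server one at a time instead. A minimal sketch, assuming a recent enough DBD::mysql and with DSN/user/password as placeholders:

```perl
use strict;
use DBI;

# Placeholder connection parameters -- substitute your own.
my $dbh = DBI->connect( 'DBI:mysql:database=tiger', 'user', 'pass',
                        { RaiseError => 1 } );

# mysql_use_result => 1 makes the driver stream rows from the server
# (mysql_use_result) rather than buffering the whole result set in
# client memory (mysql_store_result, the default).
my $sth = $dbh->prepare(
    "SELECT * FROM Tiger_main WHERE rdid > '' ORDER BY rdid",
    { mysql_use_result => 1 }
);
$sth->execute;

while ( my $row = $sth->fetchrow_hashref ) {
    # examine each record here
}
$sth->finish;
```

The trade-off is that the connection is tied up until the last row is fetched, so you cannot issue other queries on the same handle mid-scan, and a slow client holds server resources for the duration.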

Is there a fast way to iterate over the table with a succession of
SELECT queries?  A statement like 'SELECT * FROM Tiger_main
LIMIT 10000,1' is very slow, and iterating over the entire table that
way would take weeks.
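The succession-of-queries idea can be made fast with keyset pagination: instead of an OFFSET that MySQL must scan and discard each time, remember the last rdid seen and seek past it, so every query walks the index on rdid. A sketch, assuming rdid values are unique (if not, rows sharing the boundary value could be skipped):

```perl
# Keyset pagination: each batch starts where the previous one ended.
# The placeholder is bound to the last rdid fetched so far.
my $last_rdid = '';
while (1) {
    my $sth = $dbh->prepare(
        'SELECT * FROM Tiger_main WHERE rdid > ? ' .
        'ORDER BY rdid LIMIT 10000'
    );
    $sth->execute($last_rdid);

    my $rows = 0;
    while ( my $row = $sth->fetchrow_hashref ) {
        $rows++;
        $last_rdid = $row->{rdid};
        # examine each record here
    }
    last if $rows == 0;    # no more batches
}
```

Each batch is a cheap indexed range scan, so total time stays roughly linear in the table size rather than quadratic as with a growing OFFSET.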

I am using MySQL server version 3.23.36 under RH Linux 7.1.

Thanks for any support you can offer.

David Keeney
-- 
David Keeney               [EMAIL PROTECTED]
Travel By Road             http://www.travelbyroad.net


---------------------------------------------------------------------
Before posting, please check:
   http://www.mysql.com/manual.php   (the manual)
   http://lists.mysql.com/           (the list archive)

To request this thread, e-mail <[EMAIL PROTECTED]>
To unsubscribe, e-mail <[EMAIL PROTECTED]>
Trouble unsubscribing? Try: http://lists.mysql.com/php/unsubscribe.php