When performing a query that will return a large dataset (by large I mean
100k+ rows), is it more efficient (memory-wise) to use the results as
you fetch them, or to fetch all the results into an array, free the statement
handle, and then process from the array? What about performance-wise? I am
using Perl w/ DBI, but I assume the answer would be the same if I were using
the C API as well.
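
For reference, here's a minimal sketch of the two approaches I'm comparing
(the connection details and table/column names are made up):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details.
    my $dbh = DBI->connect("DBI:mysql:database=test;host=localhost",
                           "user", "password", { RaiseError => 1 });

    my $sth = $dbh->prepare("SELECT id, name FROM big_table");

    # Approach 1: process each row as it is fetched.
    $sth->execute();
    while (my $row = $sth->fetchrow_arrayref()) {
        # ... process @$row ...
    }

    # Approach 2: pull the entire result set into memory, free the
    # statement handle, then process from the array.
    $sth->execute();
    my $rows = $sth->fetchall_arrayref();
    $sth->finish();
    for my $row (@$rows) {
        # ... process @$row ...
    }

    $dbh->disconnect();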

I have tried setting $sth->{mysql_use_result} = 1 to see how using
mysql_use_result changes the memory usage and performance, but I get odd
errors like:

Issuing rollback() for database handle being DESTROY'd without explicit
disconnect()

and

DBI::db=HASH(0x14652c)->disconnect invalidates 1 active statement handle
(either destroy statement handles or call finish on them before
disconnecting)

even though the code works fine without the mysql_use_result line above. Is
this a known bug in the MySQL DBI driver?
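
For context, here's roughly what I'm doing (table name is made up). My guess
from the warnings is that the handle needs an explicit finish() before
disconnecting when mysql_use_result is on, but I'm not sure:

    my $sth = $dbh->prepare("SELECT id, name FROM big_table");

    # Stream rows from the server instead of buffering the whole
    # result set client-side (mysql_use_result vs. mysql_store_result).
    $sth->{mysql_use_result} = 1;
    $sth->execute();

    while (my $row = $sth->fetchrow_arrayref()) {
        # ... process @$row ...
    }

    # Explicitly release the handle before disconnecting, so DBI
    # doesn't complain about an active statement handle.
    $sth->finish();
    $dbh->disconnect();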

Does anyone have advice on how to process such large queries?

Thanks,
ryan

