On 6 Feb 2006 at 16:03, David Yee wrote:

> Hi all- is there a way to have a large data result set from MySQL compressed?
> E.g. I have a table with over a million rows of data that I want to do a
> "select * from " on and then take that result, do some field/data
> manipulation, and then insert it row by row into another table.  The problem
> is that the result of the query is so big that it's causing PHP to swap to
> disk, slowing things to a crawl.  Doing a "show processlist" on the mysql
> console shows that "Writing to net" is the state of the running "select *
> from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
> mysql_pconnect() and mysql_connect(), but it doesn't seem to do any
> compression (I can tell by the size of the running php process in memory).
> Any ideas would be appreciated- thanks.

You could try using the LIMIT keyword with an offset to fetch the 
records in more manageable chunks, then write out each chunk and free 
its resources before loading the next one.
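
Something like this untested sketch, using the same old mysql_* API as 
your post (the table and column names here are placeholders for your 
own schema):

<?php
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

$chunk  = 10000;  // rows per batch - tune to taste
$offset = 0;

while (true) {
    $result = mysql_query(
        "SELECT * FROM source_table LIMIT $offset, $chunk", $link);

    if (!$result || mysql_num_rows($result) == 0) {
        break;  // no rows left
    }

    while ($row = mysql_fetch_assoc($result)) {
        // ... do your field/data manipulation here ...
        $value = mysql_real_escape_string($row['some_column'], $link);
        mysql_query("INSERT INTO dest_table (some_column)
                     VALUES ('$value')", $link);
    }

    mysql_free_result($result);  // release this chunk's memory
    $offset += $chunk;
}
?>

One thing to watch: on a table that size, a large OFFSET gets slow, 
since MySQL still has to scan past all the skipped rows.  If the 
source table has a numeric primary key, paging with "WHERE id > 
$last_id LIMIT $chunk" is usually faster than a growing offset.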

Geoff.  

> 
> David
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
