Re: [PHP] PHP mysql data result set compression

2006-02-06 Thread Chris

There's only one way to find out :)

David Yee wrote:

Thanks guys for clarifying the compression aspect.  Using
mysql_unbuffered_query w/ multiple connections sounds nice and simple- though
would this method mean more disk access than multiple limit queries? As far
as speed goes I imagine if I load as big of a dataset as possible into
physical memory w/o disk swapping then that would be the fastest way to do
this?

David

-Original Message-
From: Chris [mailto:[EMAIL PROTECTED]
Sent: Monday, February 06, 2006 4:50 PM
To: David Yee
Cc: 'php-general@lists.php.net'
Subject: Re: [PHP] PHP mysql data result set compression


Hi David,

From the comments on unbuffered_query:
However, when using different db connections, it all works of course ...

So create a second db connection and when you run the insert use that 
instead:


$result2 = mysql_query("insert blah", $dbconnection_two);


client-compress will compress the data on the way to php but then it has 
to be uncompressed etc (this won't affect much if you're doing it to a 
local mysql server though, it's more for network servers).



David Yee wrote:


Thanks guys- I think I'll have to do multiple queries using LIMIT as Geoff
suggested since apparently mysql_unbuffered_query() would lose the result
set of the "select * from" query once I run the insert query.  I'm still not
sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect, however.

David

-Original Message-
From: Chris [mailto:[EMAIL PROTECTED]
Sent: Monday, February 06, 2006 4:16 PM
To: David Yee
Cc: 'php-general@lists.php.net'
Subject: Re: [PHP] PHP mysql data result set compression


Hi David,

See http://www.php.net/mysql_unbuffered_query

It won't load the whole lot into memory before returning it to php.

David Yee wrote:



Hi all- is there a way to have a large data result set from MySQL compressed?
E.g. I have a table with over a million rows of data that I want to do a
"select * from " on and then take that result, do some field/data
manipulation, and then insert row-by-row to another table.  The problem is
the result of the query is so big that it's causing PHP to swap to disk,
causing things to slow to a crawl.  Doing a "show processlist" on the mysql
console shows that "Writing to net" is the state of the running "select *
from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
mysql_pconnect() and mysql_connect() but it doesn't seem to do any
compression (I can tell by the size of the running php memory process).  Any
ideas would be appreciated- thanks.

David

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] PHP mysql data result set compression

2006-02-06 Thread David Yee
Thanks guys for clarifying the compression aspect.  Using
mysql_unbuffered_query w/ multiple connections sounds nice and simple- though
would this method mean more disk access than multiple limit queries? As far
as speed goes I imagine if I load as big of a dataset as possible into
physical memory w/o disk swapping then that would be the fastest way to do
this?

David

-Original Message-
From: Chris [mailto:[EMAIL PROTECTED]
Sent: Monday, February 06, 2006 4:50 PM
To: David Yee
Cc: 'php-general@lists.php.net'
Subject: Re: [PHP] PHP mysql data result set compression


Hi David,

From the comments on unbuffered_query:
However, when using different db connections, it all works of course ...

So create a second db connection and when you run the insert use that 
instead:

$result2 = mysql_query("insert blah", $dbconnection_two);


client-compress will compress the data on the way to php but then it has 
to be uncompressed etc (this won't affect much if you're doing it to a 
local mysql server though, it's more for network servers).


David Yee wrote:
> Thanks guys- I think I'll have to do multiple queries using LIMIT as Geoff
> suggested since apparently mysql_unbuffered_query() would lose the result
> set of the "select * from" query once I run the insert query.  I'm still not
> sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect, however.
> 
> David
> 
> -Original Message-
> From: Chris [mailto:[EMAIL PROTECTED]
> Sent: Monday, February 06, 2006 4:16 PM
> To: David Yee
> Cc: 'php-general@lists.php.net'
> Subject: Re: [PHP] PHP mysql data result set compression
> 
> 
> Hi David,
> 
> See http://www.php.net/mysql_unbuffered_query
> 
> It won't load the whole lot into memory before returning it to php.
> 
> David Yee wrote:
> 
>>Hi all- is there a way to have a large data result set from MySQL compressed?
>>E.g. I have a table with over a million rows of data that I want to do a
>>"select * from " on and then take that result, do some field/data
>>manipulation, and then insert row-by-row to another table.  The problem is
>>the result of the query is so big that it's causing PHP to swap to disk,
>>causing things to slow to a crawl.  Doing a "show processlist" on the mysql
>>console shows that "Writing to net" is the state of the running "select *
>>from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
>>mysql_pconnect() and mysql_connect() but it doesn't seem to do any
>>compression (I can tell by the size of the running php memory process).  Any
>>ideas would be appreciated- thanks.
>>
>>David
>>
> 
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHP mysql data result set compression

2006-02-06 Thread Chris

Hi David,

From the comments on unbuffered_query:
However, when using different db connections, it all works of course ...

So create a second db connection and when you run the insert use that 
instead:


$result2 = mysql_query("insert blah", $dbconnection_two);


client-compress will compress the data on the way to php but then it has 
to be uncompressed etc (this won't affect much if you're doing it to a 
local mysql server though, it's more for network servers).
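As a minimal sketch of the two-link suggestion above (host, credentials, table and column names are made up, and the legacy mysql_* API shown is the one current when this thread was written):

```php
<?php
// Two separate links: pass new_link = true so mysql_connect() does not
// reuse the first connection. Host, credentials and table names below
// are placeholders.
$read  = mysql_connect('localhost', 'user', 'pass', true);
$write = mysql_connect('localhost', 'user', 'pass', true);
mysql_select_db('mydb', $read);
mysql_select_db('mydb', $write);

// Unbuffered: rows stream to PHP one at a time instead of the whole
// result set being buffered in memory first.
$result = mysql_unbuffered_query('SELECT * FROM source_table', $read);

while ($row = mysql_fetch_assoc($result)) {
    // ... field/data manipulation goes here ...
    $id   = mysql_real_escape_string($row['id'], $write);
    $data = mysql_real_escape_string($row['data'], $write);

    // The INSERT runs on the second link, so it cannot clobber the
    // unbuffered result set still pending on $read.
    mysql_query("INSERT INTO dest_table (id, data)
                 VALUES ('$id', '$data')", $write);
}
?>
```

The point of the second link is that mysql_unbuffered_query() leaves its result set pending on the connection, and issuing another query on that same connection would discard it.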



David Yee wrote:

Thanks guys- I think I'll have to do multiple queries using LIMIT as Geoff
suggested since apparently mysql_unbuffered_query() would lose the result
set of the "select * from" query once I run the insert query.  I'm still not
sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect, however.

David

-Original Message-
From: Chris [mailto:[EMAIL PROTECTED]
Sent: Monday, February 06, 2006 4:16 PM
To: David Yee
Cc: 'php-general@lists.php.net'
Subject: Re: [PHP] PHP mysql data result set compression


Hi David,

See http://www.php.net/mysql_unbuffered_query

It won't load the whole lot into memory before returning it to php.

David Yee wrote:


Hi all- is there a way to have a large data result set from MySQL compressed?
E.g. I have a table with over a million rows of data that I want to do a
"select * from " on and then take that result, do some field/data
manipulation, and then insert row-by-row to another table.  The problem is
the result of the query is so big that it's causing PHP to swap to disk,
causing things to slow to a crawl.  Doing a "show processlist" on the mysql
console shows that "Writing to net" is the state of the running "select *
from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
mysql_pconnect() and mysql_connect() but it doesn't seem to do any
compression (I can tell by the size of the running php memory process).  Any
ideas would be appreciated- thanks.

David

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] PHP mysql data result set compression

2006-02-06 Thread Robert Cummings
On Mon, 2006-02-06 at 19:39, David Yee wrote:
> I'm still not sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect

That causes the data to be transferred from the MySQL server to the
client with compression. The results are still uncompressed on the
client.
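For reference, that flag is passed as the client_flags argument of mysql_connect() (host and credentials below are placeholders); it only compresses the client/server protocol on the wire:

```php
<?php
// Wire-level compression only: rows are compressed in transit between
// mysqld and PHP, then decompressed on arrival, so PHP's memory use
// for the fetched result set is unchanged.
$link = mysql_connect('db.example.com', 'user', 'pass',
                      false, MYSQL_CLIENT_COMPRESS);
?>
```

That is why watching the php process size shows no difference: the saving is in network transfer time, not in client memory.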

Cheers,
Rob.
-- 
..............................................................
| InterJinn Application Framework - http://www.interjinn.com |
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
| An application and templating framework for PHP. Boasting  |
| a powerful, scalable system for accessing system services  |
| such as forms, properties, sessions, and caches. InterJinn |
| also provides an extremely flexible architecture for       |
| creating re-usable components quickly and easily.          |
`............................................................'

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] PHP mysql data result set compression

2006-02-06 Thread David Yee
Thanks guys- I think I'll have to do multiple queries using LIMIT as Geoff
suggested since apparently mysql_unbuffered_query() would lose the result
set of the "select * from" query once I run the insert query.  I'm still not
sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect, however.

David

-Original Message-
From: Chris [mailto:[EMAIL PROTECTED]
Sent: Monday, February 06, 2006 4:16 PM
To: David Yee
Cc: 'php-general@lists.php.net'
Subject: Re: [PHP] PHP mysql data result set compression


Hi David,

See http://www.php.net/mysql_unbuffered_query

It won't load the whole lot into memory before returning it to php.

David Yee wrote:
> Hi all- is there a way to have a large data result set from MySQL compressed?
> E.g. I have a table with over a million rows of data that I want to do a
> "select * from " on and then take that result, do some field/data
> manipulation, and then insert row-by-row to another table.  The problem is
> the result of the query is so big that it's causing PHP to swap to disk,
> causing things to slow to a crawl.  Doing a "show processlist" on the mysql
> console shows that "Writing to net" is the state of the running "select *
> from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
> mysql_pconnect() and mysql_connect() but it doesn't seem to do any
> compression (I can tell by the size of the running php memory process).  Any
> ideas would be appreciated- thanks.
> 
> David
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHP mysql data result set compression

2006-02-06 Thread Chris

Hi David,

See http://www.php.net/mysql_unbuffered_query

It won't load the whole lot into memory before returning it to php.

David Yee wrote:

Hi all- is there a way to have a large data result set from MySQL compressed?
E.g. I have a table with over a million rows of data that I want to do a
"select * from " on and then take that result, do some field/data
manipulation, and then insert row-by-row to another table.  The problem is
the result of the query is so big that it's causing PHP to swap to disk,
causing things to slow to a crawl.  Doing a "show processlist" on the mysql
console shows that "Writing to net" is the state of the running "select *
from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
mysql_pconnect() and mysql_connect() but it doesn't seem to do any
compression (I can tell by the size of the running php memory process).  Any
ideas would be appreciated- thanks.

David



--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHP mysql data result set compression

2006-02-06 Thread Geoff
On 6 Feb 2006 at 16:03, David Yee wrote:

> Hi all- is there a way to have a large data result set from MySQL compressed?
> E.g. I have a table with over a million rows of data that I want to do a
> "select * from " on and then take that result, do some field/data
> manipulation, and then insert row-by-row to another table.  The problem is
> the result of the query is so big that it's causing PHP to swap to disk,
> causing things to slow to a crawl.  Doing a "show processlist" on the mysql
> console shows that "Writing to net" is the state of the running "select *
> from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
> mysql_pconnect() and mysql_connect() but it doesn't seem to do any
> compression (I can tell by the size of the running php memory process).  Any
> ideas would be appreciated- thanks.

You could try using the LIMIT keyword with an offset number to get
records in more manageable chunks, then write out each chunk, freeing
its resources before loading the next one.
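The chunked approach might look roughly like this (table name and chunk size are arbitrary; legacy mysql_* API as elsewhere in this thread):

```php
<?php
// Fetch and process the table in fixed-size chunks so only one chunk
// is ever held in PHP's memory. Table/column names are placeholders.
$chunk  = 10000;
$offset = 0;

do {
    $result  = mysql_query("SELECT * FROM source_table LIMIT $offset, $chunk");
    $fetched = mysql_num_rows($result);

    while ($row = mysql_fetch_assoc($result)) {
        // ... manipulate $row and insert it into the other table ...
    }

    mysql_free_result($result);   // release this chunk before the next
    $offset += $chunk;
} while ($fetched == $chunk);
?>
```

Note that without an ORDER BY the rows behind a given offset are not guaranteed to be stable if the table changes between queries; ordering on the primary key keeps the chunk boundaries consistent.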

Geoff.  

> 
> David
> 
> -- 
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] PHP mysql data result set compression

2006-02-06 Thread David Yee
Hi all- is there a way to have a large data result set from MySQL compressed?
E.g. I have a table with over a million rows of data that I want to do a
"select * from " on and then take that result, do some field/data
manipulation, and then insert row-by-row to another table.  The problem is
the result of the query is so big that it's causing PHP to swap to disk,
causing things to slow to a crawl.  Doing a "show processlist" on the mysql
console shows that "Writing to net" is the state of the running "select *
from " query.  I tried adding the flag "MYSQL_CLIENT_COMPRESS" to both
mysql_pconnect() and mysql_connect() but it doesn't seem to do any
compression (I can tell by the size of the running php memory process).  Any
ideas would be appreciated- thanks.

David

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php