[PHP] PHP mysql data result set compression
Hi all- is there a way to have a large data result set from MySQL compressed? E.g. I have a table with over a million rows of data that I want to do a SELECT * FROM on, take that result, do some field/data manipulation, and then insert row-by-row into another table. The problem is that the result of the query is so big that it's causing PHP to swap to disk, slowing things to a crawl. Doing a SHOW PROCESSLIST on the mysql console shows "Writing to net" as the state of the running SELECT * FROM query. I tried adding the flag MYSQL_CLIENT_COMPRESS to both mysql_pconnect() and mysql_connect(), but it doesn't seem to do any compression (I can tell by the size of the running PHP process). Any ideas would be appreciated- thanks. David -- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
Re: [PHP] PHP mysql data result set compression
On 6 Feb 2006 at 16:03, David Yee wrote: [snip] is there a way to have a large data result set from MySQL compressed? ... The problem is the result of the query is so big that it's causing PHP to swap to disk, causing things to slow to a crawl. [/snip] You could try using the LIMIT keyword with an offset number to get records in more manageable chunks, then write out each chunk, freeing its resources before loading the next one. Geoff.
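Geoff's chunked approach might look roughly like the sketch below. It's only an illustration: the table names, column names, and chunk size are invented, and it uses the old mysql_* API the thread is built around (long since removed from modern PHP).

```php
<?php
// Hypothetical chunked copy: read 10,000 rows at a time with LIMIT/OFFSET
// so PHP never holds the full million-row result set in memory at once.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

$chunk  = 10000;
$offset = 0;
do {
    $result = mysql_query(
        "SELECT * FROM source_table LIMIT $offset, $chunk", $link);
    $rows = mysql_num_rows($result);
    while ($row = mysql_fetch_assoc($result)) {
        // ... field/data manipulation here ...
        $name = mysql_real_escape_string($row['name'], $link);
        mysql_query(
            "INSERT INTO dest_table (name) VALUES ('$name')", $link);
    }
    mysql_free_result($result);   // release the chunk before loading the next
    $offset += $chunk;
} while ($rows == $chunk);        // a short final chunk means we're done
?>
```

One caveat: LIMIT with a growing offset slows down as the offset climbs, because MySQL still has to skip past the earlier rows; if the table has a numeric primary key, looping on WHERE id > $last ORDER BY id LIMIT $chunk avoids that.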
Re: [PHP] PHP mysql data result set compression
Hi David, See http://www.php.net/mysql_unbuffered_query - it won't load the whole lot into memory before returning it to PHP. David Yee wrote: [snip] is there a way to have a large data result set from MySQL compressed? [/snip]
RE: [PHP] PHP mysql data result set compression
Thanks guys- I think I'll have to do multiple queries using LIMIT as Geoff suggested, since apparently mysql_unbuffered_query() would lose the result set of the SELECT * FROM query once I run the INSERT query. I'm still not sure why MYSQL_CLIENT_COMPRESS didn't seem to have an effect, however. David -Original Message- From: Chris [mailto:[EMAIL PROTECTED] Sent: Monday, February 06, 2006 4:16 PM To: David Yee Cc: 'php-general@lists.php.net' Subject: Re: [PHP] PHP mysql data result set compression [snip] See http://www.php.net/mysql_unbuffered_query It won't load the whole lot into memory before returning it to php. [/snip]
RE: [PHP] PHP mysql data result set compression
On Mon, 2006-02-06 at 19:39, David Yee wrote: [snip] I'm still not sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect [/snip] That causes the data to be transferred from the MySQL server to the client with compression. The results are still uncompressed on the client. Cheers, Rob. -- InterJinn Application Framework - http://www.interjinn.com | An application and templating framework for PHP. Boasting a powerful, scalable system for accessing system services such as forms, properties, sessions, and caches. InterJinn also provides an extremely flexible architecture for creating re-usable components quickly and easily.
Re: [PHP] PHP mysql data result set compression
Hi David, From the comments on unbuffered_query: "However, when using different db connections, it all works of course ..." So create a second db connection, and when you run the insert use that instead: $result2 = mysql_query("INSERT blah", $dbconnection_two); client-compress will compress the data on the way to PHP, but then it has to be uncompressed etc. (this won't affect much if you're talking to a local mysql server though; it's more for network servers). David Yee wrote: [snip] I think I'll have to do multiple queries using LIMIT as Geoff suggested since apparently mysql_unbuffered_query() would lose the result set of the select * from query once I run the insert query. [/snip]
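Chris's two-connection idea, sketched with hypothetical names. The fourth argument to mysql_connect() (new_link = true) matters here: without it, PHP hands back the same link for identical credentials, and the INSERT would clobber the unbuffered result set.

```php
<?php
// Two genuinely separate links: one streams rows out, one writes them back.
$read  = mysql_connect('localhost', 'user', 'pass', true);
$write = mysql_connect('localhost', 'user', 'pass', true);
mysql_select_db('mydb', $read);
mysql_select_db('mydb', $write);

// Unbuffered: rows are fetched from the server one at a time,
// so PHP's memory use stays flat regardless of table size.
$result = mysql_unbuffered_query('SELECT * FROM source_table', $read);
while ($row = mysql_fetch_assoc($result)) {
    // ... field/data manipulation here ...
    $name = mysql_real_escape_string($row['name'], $write);
    mysql_query("INSERT INTO dest_table (name) VALUES ('$name')", $write);
}
?>
```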
RE: [PHP] PHP mysql data result set compression
Thanks guys for clarifying the compression aspect. Using mysql_unbuffered_query() with multiple connections sounds nice and simple- though would this method mean more disk access than multiple LIMIT queries? As far as speed goes, I imagine that if I load as big a dataset as possible into physical memory without disk swapping, that would be the fastest way to do this? David -Original Message- From: Chris [mailto:[EMAIL PROTECTED] Sent: Monday, February 06, 2006 4:50 PM To: David Yee Cc: 'php-general@lists.php.net' Subject: Re: [PHP] PHP mysql data result set compression [snip] So create a second db connection and when you run the insert use that instead [/snip]
Re: [PHP] PHP mysql data result set compression
There's only one way to find out :) David Yee wrote: [snip] would this method mean more disk access than multiple LIMIT queries? [/snip]
[PHP] Re: php/mysql data display
Jlake [EMAIL PROTECTED] wrote in message news:[EMAIL PROTECTED] [snip] I have a small database that I want to display data from ... I imagine that I will need nested loops, but I haven't seen a tutorial showing quite what I am looking for. [/snip] Maybe this example would be of some help: http://www.templatetamer.org/index.php?MySqlRowList rush -- http://www.templatetamer.com/
[PHP] php/mysql data display
I have a small database that I want to display data from, in such a way that it shows in a table with the table header being the department category and the table cells being the categories for each department. (I have no problem connecting to the database.) I imagine that I will need nested loops, but I haven't seen a tutorial showing quite what I am looking for. If anyone can point me in the right direction that would be great.
RE: [PHP] php/mysql data display
[snip] I have a small database that I want to display data from, in such a way that it shows in a table with the table header being the department category and the table cells being the categories for each department ... [/snip] Probably just a good query would do. Show us some data in its storage format and then show us an example of what you want the output to be. We can then go from A to B. Until then your question is rather vague.
Re: [PHP] php/mysql data display
Data is as such right now for testing purposes: ID, catName, catLink, catDepartment. I will have multiple instances of the same text in catDepartment. I realize this is a faux pas; I'm just trying to do a quick fix. Nonetheless, I want the data formatted like this:

DEPARTMENT NAME 1 (catDepartment)    DEPARTMENT NAME...n (catDepartment)
-----------------------------------------------------------------------
Category1 (catName)                  Category1 (catName)
Category2 (catName)                  Category2 (catName)
Category...n (catName)               Category...n (catName)

Hopefully this makes things less vague. Thanks, J. Jay Blanchard [EMAIL PROTECTED] wrote in message news:[EMAIL PROTECTED] [snip] Show us some data in its storage format and then show us an example of what you want the output to be. We can then go from A to B. [/snip]
RE: [PHP] php/mysql data display
[snip] Data is as such right now for testing purposes: ID, catName, catLink, catDepartment ... I want the data formatted like this: DEPARTMENT NAME 1 (catDepartment) DEPARTMENT NAME...n (catDepartment) - Category1 (catName) ... [/snip] You need a crosstab query in SQL. Please search google for crosstab query.
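Short of a real crosstab query, the nested-loop layout the original poster described can also be sketched in PHP: gather the rows into an array keyed by department, then render one column per department. Connection details and the table name are invented; the column names come from the thread, and the mysql_* API matches the period.

```php
<?php
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

// Gather the category names grouped by department.
$result = mysql_query(
    'SELECT catDepartment, catName FROM categories ORDER BY catDepartment',
    $link);
$depts = array();
while ($row = mysql_fetch_assoc($result)) {
    $depts[$row['catDepartment']][] = $row['catName'];
}

// Render: department names across the header row, categories beneath each.
echo "<table>\n<tr>";
foreach (array_keys($depts) as $dept) {
    echo '<th>' . htmlspecialchars($dept) . '</th>';
}
echo "</tr>\n";
$height = max(array_map('count', $depts));  // tallest column decides row count
for ($i = 0; $i < $height; $i++) {
    echo '<tr>';
    foreach ($depts as $cats) {
        echo '<td>'
           . (isset($cats[$i]) ? htmlspecialchars($cats[$i]) : '')
           . '</td>';
    }
    echo "</tr>\n";
}
echo "</table>\n";
?>
```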
[PHP] mySQL Data Limits
I have a database with a TEXT field in it, and I've been using it this way for over three years... today was the first time an INSERT was truncated on this field. It was an unusually large INSERT, and due to the truncation I'm now wondering what the limits are (if any) on inserting data into a TEXT field via PHP through an HTML form? All the data was submitted (non-truncated), because an email was composed and sent out based on the form data, but the data in the database did not make it completely... thanx...
RE: [PHP] mySQL Data Limits
TEXT has a limit of 65535 characters.

BLOB, TEXT: A BLOB or TEXT column with a maximum length of 65535 (2^16 - 1) characters. See section 6.5.3.1 Silent Column Specification Changes and section 6.2.3.2 The BLOB and TEXT Types.

MEDIUMBLOB, MEDIUMTEXT: A BLOB or TEXT column with a maximum length of 16777215 (2^24 - 1) characters. See section 6.5.3.1 Silent Column Specification Changes and section 6.2.3.2 The BLOB and TEXT Types.

LONGBLOB, LONGTEXT: A BLOB or TEXT column with a maximum length of 4294967295 (2^32 - 1) characters. See section 6.5.3.1 Silent Column Specification Changes. Note that because the server/client protocol and MyISAM tables currently have a limit of 16M per communication packet / table row, you can't yet use the whole range of this type. See section 6.2.3.2 The BLOB and TEXT Types.

---John Holmes... -Original Message- From: JSheble [mailto:[EMAIL PROTECTED]] Sent: Tuesday, April 23, 2002 3:43 PM To: PHP List Subject: [PHP] mySQL Data Limits [snip] what are the limits (if any) on inserting data into a TEXT field via PHP through an HTML form? [/snip]
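Given the 65535-byte cap on TEXT, one pragmatic guard on the PHP side is to measure the submission before inserting; widening the column (e.g. ALTER TABLE mytable MODIFY body MEDIUMTEXT) is the real fix. The form-field name below is made up for illustration.

```php
<?php
// strlen() counts bytes, which is what matters against the column limit.
$body = isset($_POST['body']) ? $_POST['body'] : '';
if (strlen($body) > 65535) {
    // Too big for a TEXT column: reject it explicitly rather than let
    // MySQL silently truncate the INSERT (its default in non-strict mode).
    die('Submission is ' . strlen($body) . ' bytes; TEXT holds at most 65535.');
}
?>
```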
[PHP] MySQL data
This question hasn't much to do with PHP, but I work with PHP... Can I get the data from a table in MySQL and copy/transfer all the data to a table with the same name in another database? regards, Augusto Cesar Castoldi -- PHP General Mailing List (http://www.php.net/) To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED] To contact the list administrators, e-mail: [EMAIL PROTECTED]
Re: [PHP] MySQL data
Hi: The easiest way to do it is using mysqldump. mysqldump creates a file that contains SQL statements to reconstruct a database, which you can easily execute on the other database. It will create tables with the same names, and all the records will be copied, too. Seung-woo Nam Augusto Cesar Castoldi wrote: [snip] Can I get the data from a table in MySQL and copy/transfer all data to other database to a table with the same name? [/snip]
Re: [PHP] MySQL data
There are a few ways to do this...

1.) Copy the binary files over in the mysql data directory ... you have to be root and normally at a shell.
2.) SELECT * FROM table, and then while() you have rows, INSERT INTO other_table those values.
3.) mysqldump - can be used via PHP and used by most users ...

hope this helps ... --Joe On Tue, Apr 17, 2001 at 09:05:02PM -0300, Augusto Cesar Castoldi wrote: [snip] Can I get the data from a table in MySQL and copy/transfer all data to other database to a table with the same name? [/snip] /* Joe Stump * Sr. PHP Developer * http://www.Care2.com http://www.joestump.net http://gtk.php-coder.net */
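For completeness: when both databases live on the same server and the same MySQL user can see both, option 2 collapses into a single statement and the rows never round-trip through PHP at all. A sketch with invented names (the target table must already exist):

```php
<?php
$link = mysql_connect('localhost', 'user', 'pass');
// Cross-database copy in one statement, executed entirely inside MySQL.
mysql_query('INSERT INTO db_to.mytable SELECT * FROM db_from.mytable', $link);
?>
```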
RE: [PHP] MySQL data
#3 is the easiest... from the command line:

# mysqldump -uroot db_from > db.sql
# mysql -uroot db_to < db.sql

Now the entire database has been copied. This is the easiest way for the whole database; with a single table it is quite similar. Also, if the table in db_to already exists, then it might make more sense to do a loop in PHP and reinsert the stuff under your own conditions. Sincerely, Maxim Maletsky Founder, Chief Developer PHPBeginner.com (Where PHP Begins) [EMAIL PROTECTED] www.phpbeginner.com -Original Message- From: Joe Stump [mailto:[EMAIL PROTECTED]] Sent: Wednesday, April 18, 2001 10:48 AM To: Augusto Cesar Castoldi Cc: [EMAIL PROTECTED] Subject: Re: [PHP] MySQL data [snip] 3.) mysqldump can be used via PHP and used by most users [/snip]
[PHP] MySQL Data to a HTML-Form
Hi, I got data from my database, but how can I show it in a text field within an HTML form? Any ideas? Thanks in advance. -- Best Regards Denis
Re: [PHP] MySQL Data to a HTML-Form
Hi Denis, @ 8:13:41 PM on 4/15/2001, Denis Mettler wrote: [snip] I got data from my database, but how can I show it in a text field within an HTML form? Any ideas? [/snip] Show us some code? Assuming you've gotten the data into $row->column, just echo it into the value of one of the form fields: <input type="text" name="foo" value="<?php echo $row->column; ?>" /> -Brian -- PGP is spoken here: 0xE4D0C7C8 Please, DO NOT carbon copy me on list replies.
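One refinement worth adding to Brian's snippet: if the column value may contain quotes or angle brackets, echoing it raw breaks the attribute, so escape it first. A fuller sketch with invented table and field names:

```php
<?php
// Fetch one row and pre-fill a form field with it.
$result = mysql_query('SELECT name FROM users WHERE id = 1');
$row = mysql_fetch_object($result);
?>
<form method="post" action="save.php">
  <input type="text" name="name"
         value="<?php echo htmlspecialchars($row->name, ENT_QUOTES); ?>" />
</form>
```

ENT_QUOTES makes htmlspecialchars() convert both double and single quotes, so the snippet is safe regardless of which quoting style the attribute uses.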