On Fri, Oct 28, 2011 at 1:38 PM, Jim Long <p...@umpquanet.com> wrote:
> I'm running PHP 5.3.8 on FreeBSD 8.2 with MySQL 5.1.55.
>
> The script below is designed to be able to WHILE its way through
> a MySQL query result set, and process each row.
>
> However, it runs out of memory a little after a quarter million
> rows.  The schema fields total to about 200 bytes per row, so
> the row size doesn't seem very large.
>
> Why is this running out of memory?
>
> Thank you!
>
> Jim
>
> <?php
>
> $test_db_host = "localhost";
> $test_db_user = "foo";
> $test_db_pwd  = "bar";
> $test_db_name = "farkle";
>
> $db_host = $test_db_host;
> $db_user = $test_db_user;
> $db_name = $test_db_name;
> $db_pwd  = $test_db_pwd;
>
> if (!($db_conn = mysql_connect( $db_host, $db_user, $db_pwd )))
>        die( "Can't connect to MySQL server\n" );
>
> if (!mysql_select_db( $db_name, $db_conn ))
>        die( "Can't connect to database $db_name\n" );
>
> $qry = "select * from test_table order by contract";
>
> if ($result = mysql_query( $qry, $db_conn )) {
>
>        $n = 0;
>        while ($row = mysql_fetch_assoc( $result )) {
> // process row here
>                $n++;
>        } // while
>
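
Just a guess, but mysql_query() pulls the entire result set into
client-side memory up front; the while loop only walks through what is
already buffered. Here's an untested sketch of the unbuffered variant,
reusing $qry and $db_conn from your script:

// mysql_unbuffered_query() streams rows from the server instead of
// buffering the whole result set client-side. Caveat: you can't run
// another query on the same connection until every row has been
// fetched or the result has been freed.
if ($result = mysql_unbuffered_query( $qry, $db_conn )) {

        $n = 0;
        while ($row = mysql_fetch_assoc( $result )) {
                // process row here
                $n++;
        } // while

        mysql_free_result( $result );
} // if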

What's the difference between fetch_assoc and fetch_row?

I use:
while ($row = mysql_fetch_row($theQuery)) {
    doCartwheel();
}

on just under 300 million rows and nothing craps out. I do have
memory_limit set to 4GB, though IIRC I pushed it up for GD issues,
not MySQL ones.
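
If I remember right, the only difference between the two is the array
keys (fetch_row gives you $row[0], $row[1], ... while fetch_assoc gives
$row['contract'] and friends), so I wouldn't expect switching fetch
functions by itself to change the memory picture. To rule the limit
out, you could bump it for just that one script. Untested, and it
assumes your setup lets you change memory_limit at runtime:

// 1024M is an arbitrary example value; pick something comfortably
// above the current limit, then see if the failure point moves.
ini_set('memory_limit', '1024M');
echo "memory_limit is now " . ini_get('memory_limit') . "\n";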

Same OS and PHP version here; MySQL is 5.1.48.
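
It might also be worth printing memory_get_usage() every few thousand
rows inside the loop: if it climbs steadily, something in the per-row
processing is holding on to data; if it's already large before the
first row, the buffered result set is the likely culprit. Rough sketch
(the 10,000 interval is arbitrary):

// Drop this inside the while loop to watch memory as rows go by.
if ($n % 10000 == 0) {
        echo "$n rows, " . memory_get_usage(true) . " bytes\n";
}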
