[snip]
So you think it's more efficient and faster to load a 3 - 5 thousand row
table into an array in memory and pass that around to all of your scripts
(through sessions?), rather than just passing a $page variable and doing a
query to return 30 rows on each page??

If you pass a $Page variable, you can make your query like this:

SELECT * FROM table LIMIT $Page*30,30

Just increment and decrement $Page as you traverse the results...easy, eh?
[/snip]
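For what it's worth, the offset arithmetic in that quoted query is just `offset = page * 30`. A minimal Python sketch of the same mapping (the row data and page size here are made up for illustration):

```python
# LIMIT-style pagination: page N of size 30 maps to
# offset = N * 30, i.e. rows [offset, offset + 30).
PAGE_SIZE = 30

def page_slice(rows, page, page_size=PAGE_SIZE):
    """Return the rows that `LIMIT page*page_size, page_size` would return."""
    offset = page * page_size
    return rows[offset:offset + page_size]

rows = list(range(100))  # stand-in for 100 result rows
assert page_slice(rows, 0) == list(range(0, 30))
assert page_slice(rows, 3) == list(range(90, 100))  # last, partial page
```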

It's definitely faster; as for more efficient, I'd have to run benchmarks.
The original table consists of millions of rows, and each time you query
with LIMIT the server still has to scan those rows to evaluate the WHERE
conditions before the LIMIT is applied. Since only 3k - 5k of those millions
of rows match, that's a lot of search time on every page request. The memory
footprint of the 3k - 5k matching records, even if each record needed a full
1k (which it doesn't), is only 3 MB - 5 MB of RAM, which the server can
easily spare. The LIMIT query, running on a slow server to simulate dial-up
connections, takes anywhere from 1.3 to 2.2 minutes to execute (I've been
timing it a lot today). Since efficiency is often lumped in with speed, I
would have to surmise that using an array in this instance would be more
efficient as well.

Thanks!

Jay

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php