On 25-09-2012 12:49, Marcelo Bianchi wrote:
Dear list,
I developed a script that has to create considerably large arrays of integer
numbers in memory. Currently my script has to hold in memory two
arrays of around 120,000 numbers each (for one year of data; I would like to
reach two years per query) that are loaded from a MySQL server.
I developed the script on a 32-bit Ubuntu machine, and when I brought the
script to our server, a 64-bit openSUSE machine, the normal 128M PHP memory
limit was exhausted.
On my Ubuntu system the script consumed around 30 MB of memory per
year of data; on the openSUSE machine the same script consumes more than
90 MB per year.
Is there a way to reduce this memory consumption on a 64-bit machine?
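As a first step, it may help to measure the actual per-element cost on each machine. A minimal sketch (the element count of 120,000 matches the figures above; exact numbers vary by PHP version and build):

```php
<?php
// Rough measurement of the per-element overhead of a plain PHP array
// of integers. On 64-bit builds, zvals and hashtable buckets contain
// several pointers that double in size, which can roughly triple the
// footprint compared to a 32-bit build.
$n = 120000;

$before = memory_get_usage();
$a = array();
for ($i = 0; $i < $n; $i++) {
    $a[] = $i;
}
$after = memory_get_usage();

printf("%d ints: %.1f MB (%.0f bytes/element)\n",
    $n, ($after - $before) / 1048576, ($after - $before) / $n);
```

Running this on both the Ubuntu and openSUSE machines should show whether the difference really comes from the array representation rather than from something else in the script.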
Also, I made a test with four different VirtualBox machines (two
openSUSE and two Ubuntu machines with the same PHP version, openSUSE
11.4 and Ubuntu 11.04, all running PHP 5.3.5), and the result was quite
shocking to me. I am still wondering what the difference between
them is and how to explain the results.
A screenshot of my machine running the four virtual machines can be seen at:
https://sites.google.com/site/foo4funreborn/phpcomp
I would greatly appreciate any help.
With my best regards,
Marcelo Bianchi
A while ago, there was a post on the php.internals list about a similar
problem. An explanation was provided of why arrays tend to grow huge when
you keep appending small elements to them.
I won't repeat what it was exactly, but you can read the thread here:
http://marc.info/?l=php-internals&m=133762629930776&w=2
It mainly talks about objects at the start, but it's about arrays
further on.
One option it suggests is to use SplFixedArray to fix the total number of
elements up front, and thus limit the memory used (a lot).
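To make the suggestion concrete, here is a minimal sketch comparing a plain array to SplFixedArray for the same integer data (SplFixedArray has been available since PHP 5.3, so it should work on the versions mentioned above; the savings vary by PHP version):

```php
<?php
// Compare the memory footprint of a plain (hashtable-backed) array
// with an SplFixedArray of the same size. SplFixedArray allocates a
// flat, fixed-size slot table, so it avoids the per-bucket overhead
// and the over-allocation that appending to a plain array causes.
$n = 120000;

$before = memory_get_usage();
$plain = array();
for ($i = 0; $i < $n; $i++) {
    $plain[] = $i;
}
$plainBytes = memory_get_usage() - $before;

$before = memory_get_usage();
$fixed = new SplFixedArray($n);   // capacity fixed at construction
for ($i = 0; $i < $n; $i++) {
    $fixed[$i] = $i;
}
$fixedBytes = memory_get_usage() - $before;

printf("plain array:   %.1f MB\n", $plainBytes / 1048576);
printf("SplFixedArray: %.1f MB\n", $fixedBytes / 1048576);
```

The trade-off is that the size must be known (or estimated) in advance; SplFixedArray::setSize() can grow it later, but that reallocation is exactly what this approach tries to avoid.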
- Tul
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php