ID:               39736
 User updated by:  cnww at hfx dot andara dot com
 Reported By:      cnww at hfx dot andara dot com
-Status:           Bogus
+Status:           Open
 Bug Type:         Arrays related
 Operating System: Debian GNU/Linux
 PHP Version:      5.2.0
 New Comment:

Yes, obviously the serialized string can be larger than the array in
memory. The issue here is that when you get a sufficiently complex
array, the memory required to serialize it suddenly balloons out to
SEVERAL HUNDRED TIMES the size of the array.

Just for the heck of it I created some more swap space and tried this
again with a memory limit of 4GB (actually 4095MB; apparently PHP
cannot handle 4096), which allowed the serialize() call to succeed. It
eventually produced a 6.8MB output string (7,147,979 characters to be
exact), and the PHP process' peak memory usage while creating this
6.8MB string was 1,540MB of RAM.

So what is it doing with ~7MB of data that requires over 1.5GB of
RAM?

I may be missing something, but at the moment I do not see any reason
why the memory requirement for serialize() ought to be anything more
than O(n) on the size of the input. I had a look at the source in
var.c, but nothing jumps out at me as an obvious problem. My best guess
based on the symptoms is that it's somehow copying the buffer and/or
hash table when recursing through an array of arrays.
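For anyone who wants to poke at this themselves, here is a minimal sketch that measures serialize()'s peak memory against the length of its output. The array shape, the depth of 1000, and the 64-byte payload are all made up for illustration; this is not the original $metacache_buf, just a deeply nested array of arrays like the one described above.

```php
<?php
// Hypothetical test case: build a 1000-level-deep array of arrays and
// compare serialize()'s output length to the peak memory it consumed.
// If serialize() is O(n) in memory, the two numbers should be of the
// same order of magnitude.

$tree = array();
$node =& $tree;
for ($i = 0; $i < 1000; $i++) {
    $node['payload'] = str_repeat('x', 64); // 64 bytes of filler per level
    $node['child']   = array();
    $node =& $node['child'];                // descend one level
}
unset($node); // break the reference so serialize() sees a plain tree

$before = memory_get_peak_usage();
$ser    = serialize($tree);
$after  = memory_get_peak_usage();

printf("serialized length: %d bytes\n", strlen($ser));
printf("peak memory delta: %d bytes\n", $after - $before);
```

memory_get_peak_usage() is available as of PHP 5.2.0, so this should run on the same version the bug was reported against.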

If this really isn't a bug then I apologize for wasting your time. I'll
polish my own serialize() function a bit and post it as a comment in the
function reference for other people running into this problem.


Previous Comments:
------------------------------------------------------------------------

[2006-12-04 21:26:09] [EMAIL PROTECTED]

Thank you for taking the time to write to us, but this is not
a bug. Please double-check the documentation available at
http://www.php.net/manual/ and the instructions on how to report
a bug at http://bugs.php.net/how-to-report.php

The serialize() representation of the data is quite a bit larger
than its representation in memory, so a 60MB array could
easily take 120-180MB as a serialized string.

------------------------------------------------------------------------

[2006-12-04 21:24:28] cnww at hfx dot andara dot com

Description:
------------
When called on a tree implemented as a multi-dimensional array
occupying something like 60MB of RAM, serialize() fails with an error
like:

Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried
to allocate 6282420 bytes) in /var/www/lib/metacache.php on line 105

So either the memory allocation system is busted or the serialization
of ~60MB of data (actually the full text version is only 9.1MB)
requires over 1GB of memory.

Reproduce code:
---------------
The line that is actually failing is:

$meta_ser = serialize( &$metacache_buf );

For obvious reasons I have not posted the whole array here. 

I created an example program that reproduces the problem using an array
of 242,919 strings made with var_export() from the directory listing of
a Windows machine's D: drive.

You can get the code from:
http://arachne.k-pcltee.com/~cnww/serialbug.bz2

The example program seems to require around 52MB of RAM to create the
array, so if you're using the default 8MB limit it will fail before it
gets to the serialize() call. Any limit of 64MB or greater seems to
result in it failing during serialize().
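In case that download link goes away, a rough stand-in for the example program would look something like the following. The paths are hypothetical dummy strings, not the real D: drive listing, so it only mirrors the shape of the test (242,919 strings in one array) rather than reproducing the exact data:

```php
<?php
// Hypothetical stand-in for serialbug.php: fill an array with 242,919
// strings (here generated dummy paths instead of a real directory
// listing), then serialize it under a raised memory limit.
ini_set('memory_limit', '1024M');

$metacache_buf = array();
for ($i = 0; $i < 242919; $i++) {
    $metacache_buf[] = "D:\\some\\path\\file_$i.ext";
}

$meta_ser = serialize($metacache_buf);

echo "Exiting normally\n";
```

With the default 8MB limit this should die while building the array; with 64MB or more, the interesting question is whether it survives the serialize() call.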

Expected result:
----------------
It should print "Exiting normally"

Actual result:
--------------
Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried
to allocate 6010844 bytes) in /tmp/serialbug.php on line 313030


------------------------------------------------------------------------


-- 
Edit this bug report at http://bugs.php.net/?id=39736&edit=1
