Ernest Lergon wrote:

> So I turned it around:
>
> $col holds now 18 arrays with 14000 entries each and prints the correct
> results:

...

> and gives:
>
>  SIZE    RSS  SHARE
> 12364    12M   1044
>
> Wow, 2 MB saved ;-))
That's pretty good, but obviously not what you were after. I tried using the pre-size array syntax ($#array = 14000), but it didn't help any. Incidentally, that map statement in your script isn't doing anything that I can see.

> I think, a reference is a pointer of 8 Bytes, so:
>
> 14.000 * 8 = approx. 112 KBytes - right?

Probably more. Perl data types are complex. They hold a lot of metadata (whether the ref is blessed, for example).

> This doesn't explain the difference of 7 MB calculated and 14 MB
> measured.

The explanation is that perl uses a lot of memory. For one thing, it allocates RAM in buckets: when you hit the limit of the allocated memory, it grabs more, and I believe it grabs an amount in proportion to what you've already used. That means that as your structures get bigger, it grabs bigger chunks. The whole 12MB may not be in use, although perl has reserved it for possible use. (Grabbing memory byte by byte would be less wasteful, but much too slow.)

The stuff in perldebguts is the best reference on this, and you've already looked at that. I think your original calculation failed to account for two things: the sizes given there for scalars are minimums (i.e. scalars with something in them won't be that small), and you are accessing many of these values in more than one way (i.e. as string, float, and integer), which increases their size.

You can try playing with compile options (your choice of malloc affects this a little), but at this point it's probably not worth it. There's nothing wrong with 12MB of shared memory, as long as it stays shared. If that doesn't work for you, your only choice will be to trade some speed for reduced memory usage by using a disk-based structure.

At any rate, mod_perl doesn't seem to be at fault here. It's just a general perl issue.

- Perrin