Hi Clint

Yes, we're on Linux, and this script looks useful. We think that part of the 
problem is in the modules Apache is loading, so this will help.

I also have a couple of other questions:

I have found the errant code where our process jumps by 13 MB. One part 
does something like this:

$file_handle->read($s, $length);   # $s is about 0.5 MB
@data = unpack($format, $s);
# at this point memory usage jumps by 8 MB (measured using GTop->size())

while (@data) {
    # this isn't exact, but each element of @data2 becomes a reference
    # to a 3-element array - i.e. the binary data was stored in triplets
    push @data2, [ shift @data, shift @data, shift @data ];
}
# this loop causes another jump of 4 MB

return \@data2;

I tried undef'ing @data just before the return, since it is no longer used, 
but this only gained me about 0.5 MB. I would have expected to get all 8 MB 
back, and I don't understand why not.
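I gather that perl normally keeps freed memory in its own allocator for reuse rather than returning it to the OS, which may be why the undef barely shows up in GTop. One thing I'm considering is consuming @data destructively with splice, so the source array shrinks as @data2 grows and the two never coexist in full. An untested sketch (the (1 .. 12) list just stands in for unpack's real output):

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Sketch: splice off three elements at a time, so @data shrinks
# as @data2 grows instead of holding both full copies at the end.
my @data  = (1 .. 12);               # stands in for unpack's output
my @data2;

while (my @triplet = splice(@data, 0, 3)) {
    push @data2, \@triplet;          # a fresh 3-element array ref each pass
}

# @data is now empty; perl can reuse its memory internally, though the
# process size reported by the OS may still not shrink.
```

Even then, I assume the peak during unpack itself is unavoidable, since every unpacked value carries per-scalar overhead on top of the raw bytes.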


Also - in general terms, if you do something like this (simplified):

package MyHandler;

use MyClass;

sub handler {
    my $obj = MyClass->new();
}

.......
package MyClass;

our $var;

sub new {
    $var = "hello world";
}


Since the module containing the package MyClass is loaded into the 
apache/mod_perl process, does $var ever go out of scope once set? I think 
not, so its memory is never freed. If that is correct, would using my 
instead make it go out of scope?
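To make the question concrete, here is a standalone version of what I mean (names are illustrative):

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Illustrative only - comparing a package global with a lexical.
package MyClass;

our $var;   # package global: does this ever go out of scope once set,
            # or does it live for the life of the mod_perl child?

sub new {
    my $class = shift;
    my $scratch = "temporary";   # lexical: I'd expect this to be freed
                                 # when new() returns, since nothing
                                 # else references it
    $var = "hello world";        # still set after new() returns
    return bless {}, $class;
}

package main;

my $obj = MyClass->new();
print "$MyClass::var\n";         # prints "hello world"
```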


Thank you for your patience.

regards

Justin






> Hi Justin
>
>>
>> I'm wondering if anyone can advise me on how I could go about trying
>> to understand where this 90 MB is coming from? Some of it must be
>> the mod_perl and apache binaries - but how much should they be, and
>> apart from the 6 MB in shared memory for my pre-loaded modules, where
>> is it all coming from?
>
> You don't mention what platform you are on, but if it is linux, then
> check out this blog
>
>   http://bmaurer.blogspot.com/2006/03/memory-usage-with-smaps.html
>
> and specifically this script
>
>   http://www.contrib.andrew.cmu.edu/%7Ebmaurer/memory/smem.pl
>
>
> which will give you a much more accurate idea about how much memory is
> shared and where it is being used.
>
> As Perrin said, Perl data and code both go on the heap, so you won't be
> able to separate those two out with this tool, but combining smem.pl
> with loading modules one by one will get you a long way to a diagnosis.
>
> clint 
