John Pettitt writes:

> > > Actually no - I specifically don't want to go though a tar/untar step
> > > because for some reason on my FreeBSD 5.4 box BackupPC_tarCreate has a
> > > memory leak that causes it to fail after a few thousand files.   I was
> > > looking for a way round that bug.
> >
> >you're probably already way on top of this, but a bug in
> >BackupPC_tarCreate that was suspected to be a memory leak turned
> >out (sometime within the last month) to be a perl typo in the
> >routine that writes the tar archive.
> >
> >so, are you sure it's a memory leak?  
> >
> >okay, i just went and checked my mail archives, and see from your
> >original symptoms that it does indeed look a whole lot like a
> >memory leak, but it's probably worth trying the fix anyway, if
> >you haven't already done so.  (it's easy:  at line 230 in
> >BackupPC_tarCreate, change "my $fh = @_;" to "my ($fh) = @_;".)
> >
> >paul
> >
> >  
> >
> I don't think that's it (although I did apply the patch) - there are
> no hardlinks in the data set in question, so that shouldn't trigger
> the leak.  What I'm seeing is an increase in the size of the perl
> process running BackupPC_tarCreate that seems to be proportional to
> the number of files processed (that is, it grows more slowly with big
> files than with small ones).
> 
> Does anybody have any magic perl tricks to catch memory leaks that
> they'd like to share?

I have also seen the memory leak problem on my setup.  I think
it is unrelated to the hard-link bug mentioned above.

To debug it I built a version of perl with -DPERL_DEBUGGING_MSTATS;
the Devel::Peek module then lets you call mstat() to get detailed
memory-allocation statistics.

I wasn't able to localize the problem.  My tentative conclusion is
that there isn't a true leak, but rather memory fragmentation that
causes the process to use more and more memory.  I couldn't confirm
that, though, and pre-allocating memory for certain variables didn't
improve the behavior.

Since you don't have hardlinks, you can break your restore into
several pieces (e.g., by subdirectory).
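
Scripted, that might look like the sketch below.  It assumes the
standard BackupPC_tarCreate options (-h host, -n dumpNum, -s
shareName); the host, share, and subdirectory names are placeholders:

```shell
#!/bin/sh
# Sketch: restore a backup in per-subdirectory chunks so that each
# BackupPC_tarCreate process stays small and its memory is freed
# between chunks.  All names below are placeholders.

restore_chunks() {
    host=$1; num=$2; share=$3; dest=$4; shift 4
    for sub in "$@"; do
        # One BackupPC_tarCreate run per subdirectory of the share,
        # piped straight into tar for extraction.
        BackupPC_tarCreate -h "$host" -n "$num" -s "$share" "/$sub" \
            | tar -xf - -C "$dest"
    done
}

# Example (placeholders): most recent backup (-1) of share /home on "myhost"
# restore_chunks myhost -1 /home /restore/home alice bob carol
```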

Craig


_______________________________________________
BackupPC-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/
