Trent Fisher wrote:
I was going to do a dump/load sequence on a large repository to rearrange things, and I got an "out of memory" error! At first I thought it was sloppiness in my perl script doing the filtering, but, no, it was "svnadmin dump" itself which ran out of memory! What's worse, the "out of memory" error seems to have been printed on stdout rather than stderr.
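
For context, the overall sequence looks roughly like this (filter.pl is just a stand-in name for my perl filtering script, and the repository paths are placeholders):

  svnadmin dump /path/to/old-repo | perl filter.pl | svnadmin load /path/to/new-repo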

I've been able to reproduce this. I've done it three times, and each time it gets to revision 54942 (out of 181168), at which point it is taking up 3.1 gigs of memory (VmSize).

Admittedly, I am doing this on a test machine which doesn't have a ton of memory (4 gigs physical + 10 gigs swap). It would take more to run the main server out of memory (it has 32+16 gigs). But the concerning thing is that it seems possible to do so, given a large enough repository.

Is this a known problem, or is it just me?

FYI, this is with Subversion 1.6.5 on an Oracle (Red Hat) Enterprise Linux 4 update 7 machine.

++thanks,
trent...



Hi Trent,
I have had a similar experience with huge repos (150 GB). To handle this, have you considered using incremental dumps?
i.e. svnadmin dump /path/to/repo --revision x:y --incremental > dump1
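
For example, a small shell loop along these lines would keep each invocation bounded (the 1000-revision chunk size and the dump.$start-$end file names are arbitrary choices):

  STEP=1000
  HEAD=$(svnlook youngest /path/to/repo)
  for ((start=0; start<=HEAD; start+=STEP)); do
      end=$((start + STEP - 1))
      [ "$end" -gt "$HEAD" ] && end=$HEAD
      # Each chunk runs in a fresh svnadmin process, so memory
      # use cannot build up across the whole revision range.
      svnadmin dump /path/to/repo --revision $start:$end --incremental > dump.$start-$end
  done

The resulting chunks can then be filtered and loaded in order with "svnadmin load".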

Justin
