GHC has a multi-generational garbage collector. If your machine has enough physical memory that the GC isn't thrashing trying to find the 100 free bytes that remain, you should find the database migrates to the oldest generation and stays there. If you run with +RTS -Sstderr you'll see information about when each GC happens and which generation it collects; there should be many young-generation collections for each old-generation one. You can increase the number of generations with a command-line flag to the runtime system (the -G flag; see the user manual).
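To see this behaviour, here is a minimal sketch of a toy "database" program (the module name, data sizes, and use of Data.Map are my own illustrative choices, not from the original thread) that builds a long-lived structure so the promotion to the oldest generation can be observed with the RTS flags mentioned above:

```haskell
-- GcDemo.hs: a toy stand-in for a persistent "database" -- an
-- accumulating Map that is never discarded.
--
-- Compile with:  ghc -O2 -rtsopts GcDemo.hs
-- Run with:      ./GcDemo +RTS -Sstderr -G3 -RTS
--
-- +RTS -Sstderr prints one line per collection, including which
-- generation was collected; -G3 raises the number of generations to 3.
-- The long-lived Map should be promoted to the oldest generation, so
-- most collections only touch recently allocated young data.
import qualified Data.Map.Strict as Map

main :: IO ()
main = do
  -- Build 100,000 entries that are retained for the whole run.
  let db = foldl (\m k -> Map.insert k (k * 2) m)
                 Map.empty
                 [1 .. 100000 :: Int]
  print (Map.size db)
```

With -Sstderr you should see the generation number in the stats output: frequent collections of generation 0 and only occasional collections of the oldest generation, which is where the retained Map ends up.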
Simon

| -----Original Message-----
| From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]]
| Sent: 03 March 2003 18:38
| To: [EMAIL PROTECTED]
| Subject: Persistent data
|
| Is there some way to reduce the cost of garbage collection over large
| persistent data structures without resorting to escaping to C to malloc
| memory outside the heap?
|
| The program I'm working on is part database, which cannot discard
| information. The net result is that I see figures like 82.9% of the
| time taken by garbage collection. The heap profile looks like a
| charging capacitor: a linear increase (as expected) which is slowly
| dilated as time increases by the garbage collector thrashing memory.
|
| When I worked on the LOLITA natural language processor we solved the
| problem by moving a lot of the data out to C++, so that the heap only
| contained things soon to be freed. I know generational garbage
| collection is supposed to help, but it doesn't seem to. Is there a
| pure Haskell solution to this problem?
|
| Sengan
|
| _______________________________________________
| Haskell mailing list
| [EMAIL PROTECTED]
| http://www.haskell.org/mailman/listinfo/haskell

_______________________________________________
Haskell mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell