It turns out that I can reproduce the problem using the Hashtable alone, so this is not a problem specific to serialization. Here's the code:

    static void Main(string[] args)
    {
        int num = 5000000;
        Hashtable table = new Hashtable();
        for (int i = 0; i < num; ++i)
        {
            Object obj = new Object();
            table.Add(obj, obj);
        }
    }
This application crashes on Mono on Linux but works fine on Mono on Windows. The bug does not occur if I put Int32 instances in the hashtable instead of Object instances. I'd like to log a bug about it, but I wasn't sure where to log it. Should I log it for the "Runtime" component?

Jonathan

_____

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On behalf of Jonathan Gagnon
Sent: Friday, May 11, 2007 10:55 AM
To: 'Alan McGovern'
Cc: Mono-list@lists.ximian.com
Subject: Re: [Mono-list] Too many heap sections: Increase MAXHINCR or MAX_HEAP_SECTS

I changed my code to serialize smaller chunks of data, but I still seem to have some kind of leak that happens only in Mono, and I haven't been able to reproduce it in a small test case yet. It seems like the memory I allocate is not always returned to the OS, and after a while I run out of memory. I suspect this is a bug inside the GC, like you said. Should this bug be a high priority, since it seems to mean that Mono doesn't handle heavy memory allocation (unless there is something wrong with my code that happens to work well with .NET)?

I noticed that there is a new compacting GC under development. Would it be easy for me to test my app with this GC to see if it fixes the problem?

Thanks,

Jonathan

_____

From: Alan McGovern [mailto:[EMAIL PROTECTED]
Sent: Thursday, May 10, 2007 6:45 PM
To: Jonathan Gagnon
Cc: Mono-list@lists.ximian.com
Subject: Re: [Mono-list] Too many heap sections: Increase MAXHINCR or MAX_HEAP_SECTS

Ok. So the problem is this (as far as I can make out): you're fragmenting the heap and running out of "free" memory, or the GC is just choking on the amount of data you're spitting out. This is caused by two things:

1) the MemoryStream increasing in size;
2) a hashtable internal to Mono constantly increasing in size while serialisation is taking place.

If you run your testcase with int num = 9000; instead of int num = 15000; it works fine. So the best advice I can offer is to serialise your data in smaller chunks for the moment. Also, I never managed to get an OOM exception when running on MacOS, but the program did seem to crash/hang.

Alan.

On 5/10/07, Jonathan Gagnon <[EMAIL PROTECTED]> wrote:

I simplified the test a little bit. I also tried serializing to a FileStream instead of a MemoryStream and got the same result. I ran the test on Mono on Windows and it runs for a while before exiting with an OutOfMemoryException.

Jonathan

_____

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On behalf of Jonathan Gagnon
Sent: Thursday, May 10, 2007 2:53 PM
To: 'Alan McGovern'; Mono-list@lists.ximian.com
Subject: Re: [Mono-list] Too many heap sections: Increase MAXHINCR or MAX_HEAP_SECTS

Here is a little test that reproduces the problem. I thought that initializing the memory stream to a size bigger than the entire list would fix the problem, but it only makes it happen less quickly in some cases. If you play with the numbers in my little test to reduce the size of allocated memory, you will notice that it takes longer to run out of memory, but it still happens after a while. The way it behaves, it really looks like a leak: I have a loop that does the same thing on every run, and I would expect the memory allocator to be able to reuse the same memory instead of growing the heap.

Note that I compiled the test with VS 2005. I don't know if I could reproduce the bug using the mono compiler.

Jonathan
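(A minimal sketch of the chunked serialisation Alan suggests above, assuming the data lives in a List<T>; the ChunkSize value and the ChunkedSerializer/SerializeInChunks names are illustrative placeholders, not the poster's actual code:)

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    static class ChunkedSerializer
    {
        // Placeholder size; tune it against the real per-item payload.
        const int ChunkSize = 1000;

        // Serializes the list as a sequence of independent slices, so no
        // single Serialize call has to buffer the whole object graph.
        public static void SerializeInChunks<T>(List<T> items, Stream output)
        {
            BinaryFormatter formatter = new BinaryFormatter();
            for (int start = 0; start < items.Count; start += ChunkSize)
            {
                int count = Math.Min(ChunkSize, items.Count - start);
                formatter.Serialize(output, items.GetRange(start, count));
            }
        }
    }

(On the reading side you would call Deserialize in a loop until the stream is exhausted, concatenating the slices back into one list.)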
_____

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On behalf of Alan McGovern
Sent: Thursday, May 10, 2007 11:45 AM
To: Mono-list@lists.ximian.com
Subject: [Mono-list] Too many heap sections: Increase MAXHINCR or MAX_HEAP_SECTS

> Also, as a test, could you initialise the memory stream to roughly the size required to store the entire List<T> and see if it works then.

That works fine if I do it that way. But my problem is that I can't really know in advance how much memory the serialization will use, so it's not really a viable solution. I'm thinking of trying to split up my list into smaller chunks to see if this could fix the problem by avoiding the large object heap, if there is such a heap in Mono.

Sounds like your problem is due to heap fragmentation. The only solution is to use a best guess for the approximate size of the memory stream and initialise the MemoryStream to that to start off with. For example, if the average size of your class is 68 bytes, then initialise the MemoryStream to array.Length * 68, or some such thing. Still, a testcase may prove useful.

Alan.
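(To illustrate Alan's best-guess idea, a small self-contained sketch that pre-sizes the MemoryStream; the Item class, the item count, and the 68-byte figure are stand-ins, the last taken from his example above:)

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    class Item
    {
        public int Value;
    }

    class PresizeTest
    {
        static void Main()
        {
            List<Item> items = new List<Item>();
            for (int i = 0; i < 100000; ++i)
            {
                Item item = new Item();
                item.Value = i;
                items.Add(item);
            }

            // Pre-sizing the stream avoids repeated reallocate-and-copy
            // cycles of its internal buffer, one source of fragmentation.
            MemoryStream stream = new MemoryStream(items.Count * 68);
            new BinaryFormatter().Serialize(stream, items);
            Console.WriteLine("Serialized {0} bytes", stream.Length);
        }
    }

(If the guess is too low the stream still grows, so this reduces reallocations rather than eliminating them.)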
using System;
using System.Collections.Generic;
using System.Text;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Threading;
using System.Collections;

namespace TestApp
{
    class Program
    {
        static void Main(string[] args)
        {
            int num = 5000000;
            Hashtable table = new Hashtable();
            for (int i = 0; i < num; ++i)
            {
                Object obj = new Object();
                table.Add(obj, obj);
            }
            Console.WriteLine("Done");
            Thread.Sleep(5000);
        }
    }
}
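(For comparison, a sketch of the Int32-key variant mentioned near the top of the thread, which reportedly does not crash; the exact variant the poster ran was not included, so this reconstruction is an assumption:)

    using System;
    using System.Collections;

    class Int32Variant
    {
        static void Main()
        {
            int num = 5000000;
            Hashtable table = new Hashtable();
            for (int i = 0; i < num; ++i)
            {
                // Each int is boxed to a distinct object on insertion,
                // so the table still holds num unique keys.
                object boxed = i;
                table.Add(boxed, boxed);
            }
            Console.WriteLine("Done");
        }
    }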
_______________________________________________ Mono-list maillist - Mono-list@lists.ximian.com http://lists.ximian.com/mailman/listinfo/mono-list