Simon Marlow <[EMAIL PROTECTED]> writes:

> John Meacham wrote:

>> perhaps if -M is not otherwise set, 'getrlimit(RLIMIT_AS,..)' could be
>> called and the maximum heap size set to just under that

Of course, it is commonly set to 'unlimited' anyway.  Perhaps I should
limit it; OTOH, the value must be less than 2 GB (it's a signed int),
which will soon be on the small side for a modern workstation.

For my programs, I've found that setting -M to 80% of physical memory
tends to work well.  Beyond that, I get thrashing and lousy performance.
(Perhaps programs mmap'ing large files etc. can work well beyond
physical memory?  I'd be interested to hear others' experiences.)

Quite often, I find a program will run equally well with a smaller
heap (presumably GC'ing harder?).  I think a good default would be to
try as hard as possible to keep the heap smaller than physical RAM.

(Caveat: I'm on a Linux system which doesn't work very well with large
heap sizes at the moment, so my observations may not apply.)

-k
-- 
If I haven't seen further, it is by standing in the footprints of giants

_______________________________________________
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users
