On Sat, Aug 23, 2008 at 9:40 PM, Schuh, Richard <[EMAIL PROTECTED]> wrote:
> Let's say that total amount of virtual storage is the main issue, with
> everything else relegated to the status of being inconsequential.
The limit is on "active virtual memory" in the system. As Bill Holder says, it's determined by access density (or fragmentation, whatever you want to call it). There are two orders of magnitude difference between the best case and the worst case, and we found a system starting a bunch of big Linux servers to be close to the worst case. In that situation, 100 GB of real storage seemed to be a practical maximum. YMMV

-Rob