At least on OSX, it appears that mapping memory is constant time regardless of size, but there is some maximum total amount of memory I'm allowed to map, presumably based on the size of a VMM lookup table. The largest block I can allocate is 1 GB, and I can allocate roughly 131,000 of these blocks before getting an out-of-memory error. If I reduce the block size to 4 MB, I can allocate more than 10 million blocks without error, so a default stack size of around 4 MB seems about right. Increasing the size to 16 MB failed after about 800,000 allocations, which isn't enough (potential) fibers.
