On Monday, 1 April 2013 at 20:58:00 UTC, Walter Bright wrote:
> On 4/1/2013 4:08 AM, Lars T. Kyllingstad wrote:
> 5. Although a bad practice, destructors in the unwinding
> process can also allocate memory, causing double-fault issues.
Why is a double fault such a big issue?
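For readers unfamiliar with the hazard: if a destructor throws (e.g. because an allocation fails) while the stack is already unwinding from a first exception, the runtime has two live exceptions and no sane way to continue. The analogous situation in C++ (not D's actual runtime, just a sketch) calls std::terminate; the usual mitigation is for the destructor to detect unwinding and skip the risky work:

```cpp
#include <cassert>
#include <exception>
#include <string>

// Hypothetical guard type for illustration. Its destructor wants to do
// work that may throw (say, an allocation). If that throw happened while
// another exception was already unwinding the stack, C++ would call
// std::terminate -- the "double fault". The guard sidesteps this by
// checking std::uncaught_exceptions() (C++17) before doing anything risky.
struct Guard {
    static inline bool skipped = false;
    ~Guard() {
        if (std::uncaught_exceptions() > 0) {
            // Already unwinding: throwing here would terminate the program,
            // so the cleanup work is skipped (or deferred) instead.
            skipped = true;
            return;
        }
        // Safe to perform potentially-throwing cleanup here.
    }
};

bool demo() {
    try {
        Guard g;
        throw std::string("original error"); // starts unwinding; ~Guard runs
    } catch (const std::string&) {
        return Guard::skipped; // true: destructor saw the unwinding in progress
    }
    return false;
}
```

The point of the complaint is that an out-of-memory error can surface in exactly this window, turning an otherwise recoverable exception into program termination.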
> 6. Memory allocation happens a lot. This means that very few
> function hierarchies could be marked 'nothrow'. This throws a
> lot of valuable optimizations under the bus.
Can we have an overview of the optimizations that are thrown under
the bus, and how much gain they provide in general? Actual data are
always better when discussing optimizations.
> 7. With the multiple gigs of memory available these days, if
> your program runs out of memory, it's a good sign there is
> something seriously wrong with it (such as a persistent memory
> leak).
DMD regularly does.