On Wednesday, 9 October 2013 at 07:33:38 UTC, Manu wrote:
On 9 October 2013 16:05, dennis luehring <dl.so...@gmx.net> wrote:

On 09.10.2013 07:23, PauloPinto wrote:

Apple dropped the GC and went with ARC instead, because they never
managed to make it work properly.

It was full of corner cases, and the application could crash if
those cases were not fully taken care of.

Of course the PR message is "We dropped GC because ARC is better"
and not "We dropped GC because we failed".

Now, having said this, of course D needs a better GC, as the
current one doesn't fulfill the needs of potential users of the
language.


the question is: could ARC be an option for automatic memory management in D, so that the compiler generates ARC code when the GC is not used, but
can still interoperate with code that needs the GC?

or is that a hard goal to reach because of the problems of combining
GC-using and ARC-using libraries?


It sounds pretty easy to reach to me. Compiler generating inc/dec ref calls can't possibly be difficult. An optimisation that simplifies redundant
inc/dec sequences doesn't sound hard either... :/
Is there more to it? Cleaning up circular references I guess... what does
Apple do?
It's an uncommon edge case, so there's gotta be heaps of room for efficient
solutions to that one edge case (afaik). Are there others?
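
For a concrete picture, here is a minimal hand-rolled sketch in D of what
those compiler-inserted calls could look like. The Payload type and the
rcInc/rcDec names are hypothetical stand-ins for whatever the compiler
would actually emit, not a real runtime API:

import core.stdc.stdlib : malloc, free;

// Hypothetical refcounted payload; rcInc/rcDec stand in for the
// retain/release calls an ARC compiler would insert automatically.
struct Payload { int refs; int value; }

void rcInc(Payload* p) { if (p) ++p.refs; }

void rcDec(Payload* p)
{
    if (p && --p.refs == 0)
        free(p);            // last reference gone: destroy
}

void main()
{
    auto p = cast(Payload*) malloc(Payload.sizeof);
    p.refs = 1;             // the owning reference
    p.value = 42;

    auto q = p;             // copying the reference...
    rcInc(q);               // ...is where the compiler would insert an inc

    rcDec(q);               // q's last use: dec
    rcDec(p);               // owner released: count hits zero, frees p
}

Every copy of a reference gets an inc and every end-of-life gets a dec;
the interesting part is deleting the pairs that cancel out.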

Apple's compiler does flow analysis.

First, all inc/dec operations are generated as usual.

Then flow analysis is applied, and all redundant inc/dec pairs are removed before native code generation takes place.
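
As a rough illustration of the kind of pair that pass deletes (reusing the
hypothetical Payload/rcInc/rcDec sketch above; this is an assumed example,
not Apple's actual optimiser):

void use(Payload* p) { /* reads p.value; does not store p anywhere */ }

// Naive codegen: retain p around the call, even though the caller's
// own reference already keeps it alive for the whole call.
void caller(Payload* p)
{
    rcInc(p);   // inserted retain
    use(p);
    rcDec(p);   // inserted release
}

// After flow analysis: p is provably live across use() and never
// escapes, so the matched inc/dec pair is deleted before codegen.
void callerOptimised(Payload* p)
{
    use(p);
}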

There is a WWDC session where this was explained.

--
Paulo
