On Thursday, 17 April 2014 at 08:52:28 UTC, Ola Fosheim Grøstad wrote:
On Thursday, 17 April 2014 at 08:22:32 UTC, Paulo Pinto wrote:
Of course it was sold at WWDC as "ARC is better than GC" and not as "ARC is better than the crappy GC implementation we have done".

I have never seen a single instance of a GC-based system doing anything smooth in the realm of audio/visual real-time performance without being backed by a non-GC engine.

You can get decent performance from GC-backed languages for the higher-level constructs on top of a low-level engine. IMHO the same goes for ARC. ARC is a bit more predictable than GC; GC is a bit more convenient and less predictable.

I think D has something to learn from this:

1. Support for manual memory management is important for low-level engines.

2. Support for automatic memory management is important for high-level code on top of that.

The D community is torn because there is an idea that libraries should assume point 2 above and then be retrofitted for point 1. I am not sure that will work out.

Maybe it is better to just say that structs are bound to manual memory management and classes are bound to automatic memory management.

Use structs for low level stuff with manual memory management.
Use classes for high level stuff with automatic memory management.

Then add language support for "union-based inheritance" in structs with a special construct for programmer-specified subtype identification.
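A minimal sketch, with made-up names (ShapeKind, Shape), of what programmer-specified subtype identification over a union can look like with today's structs; the tag discipline is entirely manual here, which is exactly what a dedicated language construct would enforce:

import std.stdio;

enum ShapeKind { circle, rect }

struct Circle { float radius; }
struct Rect   { float w, h; }

struct Shape
{
    ShapeKind kind;   // the programmer, not the runtime, identifies the subtype
    union             // "union-based inheritance": subtypes share one allocation
    {
        Circle circle;
        Rect   rect;
    }

    float area()
    {
        final switch (kind)
        {
            case ShapeKind.circle: return 3.14159f * circle.radius * circle.radius;
            case ShapeKind.rect:   return rect.w * rect.h;
        }
    }
}

void main()
{
    Shape s;
    s.kind = ShapeKind.circle;
    s.circle = Circle(2.0f);
    writeln(s.area());   // plain value semantics, no vtable, no GC involvement
}

Language support would mainly buy safety: the compiler could check that each union member is only accessed under its own tag, which today is the programmer's job.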

That is at least conceptually easy to grasp and the type system can more easily safeguard code than in a mixed model.

Most successful frameworks that allow high-level programming have two layers:
- Python / heavy-duty C libraries
- JavaScript / browser engine
- Objective-C / C (Cocoa and Core Foundation)
- ActionScript / C engine

etc.

I personally favour the more integrated approach that D appears to be aiming for, but I am starting to feel that, conceptually, that model is going to be difficult for most programmers to grasp in real projects. They don't really want the low-level stuff, and they don't want their high-level code bastardized by low-level requirements.

As far as I am concerned, D could just focus on structs and the low-level stuff, and later try to work in the high-level stuff. There is no efficient GC in sight, and the language has not been designed for one either.

ARC with whole-program optimization fits the low-level paradigm better than GC does. So if you start from low-level programming and work your way up to high-level programming, ARC is the better fit.
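For flavour, a minimal sketch of scope-driven reference counting in D, off the GC heap; RC and makeRC are made-up names, not library API, and this only handles plain value types:

import core.stdc.stdlib : malloc, free;
import std.stdio;

struct RC(T)
{
    static struct Box { size_t count; T value; }
    private Box* box;

    this(this) { if (box) ++box.count; }   // every copy is a retain
    ~this()
    {
        if (box && --box.count == 0)
            free(box);                     // freed here, not at some future GC pause
    }

    ref T get() { return box.value; }
}

RC!T makeRC(T)(T value)
{
    auto box = cast(RC!T.Box*) malloc(RC!T.Box.sizeof);
    box.count = 1;
    box.value = value;
    return RC!T(box);
}

void main()
{
    auto a = makeRC(42);
    {
        auto b = a;           // retain: count goes to 2
        writeln(b.get());
    }                         // b's destructor releases: count back to 1
}                             // a's destructor releases: count 0, free() runs right here

The releases are pinned to scope exits, which is what makes the cost model predictable, and a whole-program optimizer can elide retain/release pairs it can prove redundant.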

Ola.

Looking at the hardware specifications of usable desktop OSs built with automatically memory-managed systems programming languages, we have:

Interlisp and Mesa/Cedar, using ARC with a GC for cycle collection, running on the Xerox 1132 (Dorado) and the Xerox 1108 (Dandelion).

http://archive.computerhistory.org/resources/access/text/2010/06/102660634-05-05-acc.pdf

Oberon running on Ceres,

ftp://ftp.inf.ethz.ch/pub/publications/tech-reports/1xx/070.pdf

Bluebottle, Oberon's successor, has a primitive video editor,
http://www.ocp.inf.ethz.ch/wiki/Documentation/WindowManager?action=download&upname=AosScreenshot1.jpg

SPIN running on the DEC Alpha, http://en.wikipedia.org/wiki/DEC_Alpha

Any iOS device runs circles around those systems, which is why I always like to make clear that it was Apple's failure to make a workable GC in a C-based language, and not the virtues of pure ARC over pure GC.

Their solution has its merits, among them, as I mentioned, generating the same code while relieving the developer of the pain of writing those retain/release calls themselves.
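To make that concrete, a sketch in D of the before/after; retain and release here are hypothetical free functions standing in for the Objective-C runtime calls:

struct RefObj { int rc = 1; }

RefObj* retain(RefObj* o)  { if (o) ++o.rc; return o; }
void    release(RefObj* o) { if (o && --o.rc == 0) { /* dealloc would run here */ } }

// Pre-ARC, manual retain/release: the programmer writes every call,
// and a forgotten release is a leak.
void manualStyle(RefObj* o)
{
    retain(o);
    // ... use o ...
    release(o);
}

// Under ARC the source contains none of these calls; the compiler inserts
// the same retain/release at the same points and removes matched pairs
// it can prove redundant.
void arcStyle(RefObj* o)
{
    // retain(o);   <- inserted by the compiler
    // ... use o ...
    // release(o);  <- inserted by the compiler
}

void main()
{
    RefObj obj;   // stack object purely for illustration
    manualStyle(&obj);
    arcStyle(&obj);
}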

A similar approach was taken by Microsoft with C++/CX and its COM integration.

So now any pure-GC basher uses Apple's example, with a high probability of not knowing the technical issues that led to it.

--
Paulo
