On Saturday, 20 May 2017 at 03:54:43 UTC, Jonathan M Davis wrote:

> Because of the issue of lifetimes, some language features simply cannot be implemented without the GC, and I don't see any point in trying to make it so that you can use all features of D without the GC. That simply won't work. By the very nature of the language, completely avoiding the GC means completely avoiding some features.

Even with the GC we have guns to shoot ourselves in the foot with, namely .destroy() and GC.free(). The GC itself is not an issue at all; it's the lack of choice in the language that is the problem. The @nogc attribute alone is not enough.
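To make the footgun concrete, here is a minimal sketch (assumes nothing beyond core.memory's GC API) of the kind of dangling reference GC.free makes possible even in GC-managed code:

```d
import core.memory : GC;

void main()
{
    auto a = new int[](4);
    auto b = a;      // a second slice over the same GC-allocated block
    GC.free(a.ptr);  // explicit free: the collector no longer protects us
    // b now dangles; touching b[0] is exactly the class of bug
    // the GC is supposed to rule out
}
```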

> D's dynamic arrays fundamentally require the GC, because they do not manage their own memory. They're just a pointer and a length and literally do not care what memory backs them. As long as all you're doing is slicing them and passing them around (i.e. restrict yourself to the range-based functions), then the GC is not involved, and doesn't need to be, but as soon as you concatenate or append, the GC has to be involved. For that not to be the case, dynamic arrays would have to manage their own memory (e.g. be ref-counted), which means that they could not be what they are now. A different data structure would be required.

That is not necessary. See my previous comment. We can amend the type system so it understands when it can't use the GC.

// Syntax is temporary, for illustration only; it is currently ambiguous
// with existing language syntax.

int[] (nogc) myArray;

auto a = myArray ~ [1, 2, 3]; // error: cannot concatenate nogc and gc arrays
auto b = myArray ~ [1, 2, 3] (myAllocator);
// implies compiler-generated code roughly like:
// auto block = myAllocator.reallocate(myArray, (myArray.length + 3) * int.sizeof);
// handle out-of-memory, copy in the new elements, etc...
// int[] (nogc) result = (cast(int*) block.ptr)[0 .. myArray.length + 3];
// return result;

Yes, verbose, and yes, ugly. Manual memory management is like that. But flat-out forbidding users from using certain features is no less verbose and no less ugly.
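For comparison, something close to this is already expressible today through Phobos' std.experimental.allocator, just without the type-system support tracking which slices are GC-backed. A sketch using the real makeArray/expandArray/dispose helpers (attribute inference with Mallocator is assumed to make this @nogc-compatible):

```d
import std.experimental.allocator : makeArray, expandArray, dispose;
import std.experimental.allocator.mallocator : Mallocator;

void demo()
{
    alias A = Mallocator.instance;

    int[] arr = A.makeArray!int(3);   // malloc-backed slice, GC not involved
    scope (exit) A.dispose(arr);

    // "append" three more elements: reallocates via the allocator, not the GC
    bool ok = A.expandArray(arr, 3);
    assert(ok && arr.length == 6);
}
```

The danger, of course, is that nothing stops such a slice from being concatenated with a GC array later, which is precisely the hole the hypothetical (nogc) annotation above would close.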

> Similarly, stuff like closures requires the GC. They need something to manage their memory. They're designed to be automatic.

They were designed long ago; perhaps that design needs revisiting. They absolutely do not *have* to manage their memory. It is convenient when they do, and very pleasant when working with the GC, but that makes them a niche feature at best. If the user is given a little more control over captures, we get more cases where no allocation is needed. If the user is given control over the allocation itself, even better, as it gives them back a feature the GC took away.

An explicit (nogc) requirement can be devised for closures too; we just need to put effort into that instead of silently ignoring the problem.
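The language already hints in this direction: when a delegate parameter is marked `scope`, the callee promises the delegate will not outlive the call, so the compiler can skip the GC closure allocation entirely. A sketch:

```d
// `scope` on the delegate parameter promises it will not escape,
// so the captured variables can stay on the caller's stack and
// no GC closure needs to be allocated.
void each(int[] xs, scope void delegate(int) @nogc nothrow dg) @nogc nothrow
{
    foreach (x; xs)
        dg(x);
}

void demo() @nogc nothrow
{
    int sum;
    int[3] xs = [1, 2, 3];
    xs[].each((int x) { sum += x; }); // captures `sum`, yet no GC allocation
    assert(sum == 6);
}
```

Generalizing that kind of explicit, checkable contract to all closures is exactly the sort of control being argued for here.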

> Honestly, I think that this push for @nogc and manual memory management is toxic. Yes, we should strive to not require the GC where reasonable, but some things simply are going to require the GC to work well, and avoiding the GC very quickly gives you a lot of the problems that you have with languages like C and C++.

> For instance, at dconf, Atila talked about the D wrapper for Excel that he wrote. He decided to use @nogc and std.experimental.allocator, and not only did that make it much harder for him to come up with a good, workable design, it meant that he suddenly had to deal with memory corruption bugs that you simply never have with the GC. He felt like he was stuck programming in C++ again - only worse, because he had issues with valgrind that made it so that he couldn't effectively use it to locate his memory corruption problems.

That is *mostly* due to the lack of facilities in the language *and* the standard library. The standard library is not written with manual memory management in mind and so provides no ready-made primitives for it, which means you have to write your own, which means you will have bugs. At least more bugs than you would have had with that help.

> The GC makes it far easier to write clean, memory-safe code. It is a _huge_ boon for us to have the GC. Yes, there are cases where you can't afford to use the GC, or you have to limit its use in order for your code to be as performant as it needs to be, but that's the exception, not the norm. And avoiding the GC comes at a real cost.

All of that is true except the last sentence. The cost need not be huge, but for that the language has to work with us. Explicitly, without any "special cases".

> And the reality of the matter is that using the GC has real benefits, and trying to avoid it comes at a real cost, much as a number of C++ programmers want to complain and deride as soon as they hear that D has a GC. And honestly, even having @nogc all over the place won't make many of them happy, because the GC is still in the language.

If people simply want to assume the GC is bad and turn away, let them; neither the community nor the language will suffer for it. OTOH, for people who do have legitimate nogc use cases, we should strive to keep as many language facilities available as possible.
