On Saturday, 11 May 2013 at 18:15:22 UTC, Daniel Murphy wrote:
> If we use them in the compiler, we effectively freeze them. We can't use
> the new modules, because the old toolchains don't have them.

Fair enough, but in such a case we could always add the parts of them we really need to the compiler source until the module is present in the oldest supported compiler version. The critical difference between this scenario and your approach is that the extra maintenance burden is limited in time: the code is guaranteed to be removed again after (say) a year, and as Phobos stabilizes more and more, the total amount of such "compatibility" code will go down as well.
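
To illustrate what I have in mind (just a sketch; the symbol choice and the version number are illustrative, not a concrete proposal), the compatibility code could be little more than a module-level static if on __VERSION__ that either pulls in the real Phobos module or provides a stripped-down local copy:

module compat;

// Use the real Phobos implementation when the bootstrap compiler is new
// enough, otherwise fall back to a bundled copy of just what we need.
static if (__VERSION__ >= 2060)
{
    public import std.algorithm : countUntil;
}
else
{
    // Temporary substitute, to be deleted once the oldest supported
    // bootstrap compiler ships the real module.
    ptrdiff_t countUntil(T)(T[] haystack, T needle)
    {
        foreach (i, e; haystack)
            if (e == needle)
                return cast(ptrdiff_t) i;
        return -1;
    }
}

Once the oldest toolchain we still care about has the module, the else branch simply gets dropped.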

> We can't fix
> old broken modules because the compiler depends on them.

I don't see your point here:

1) The same is true for any client code out there. The only difference is that we would now directly experience what any D library writer has to go through anyway if they want their code to work with multiple compiler releases.

2) If a module is so broken that any "fix" would break all client code, we are probably not going to use it in the compiler anyway.

> If you add code to
> work around old modules being gone in later versions, you pretty much end up
> moving the source into the compiler after all.

Yes, but how often do you think this will happen? At this point, the barrier for such changes should be quite high anyway. The amount of D2 code in the wild is already non-negligible and growing steadily.

> If we only need to be able to compile with a version from 6 months ago, this
> is not a problem. A year and it's still workable. But two years? Three?
> We can get something right here that gcc got so horribly wrong.

Care to elaborate on that?

David
