On 6/3/13 10:51 PM, Jonathan M Davis wrote:
> On Monday, June 03, 2013 22:25:13 Andrei Alexandrescu wrote:
>> It's useless to focus on the breakage override has caused. Yes, it did
>> cause breakage. There is no conclusion to draw from that without
>> considering the considerably complex dynamics surrounding the whole
>> matter (experience, benefits, number of users affected positively and
>> negatively).
>>
>> To use that breakage as an argument linked to absorbing breakage caused
>> by switching to final-by-default does not make sense. I'll try to
>> abstain from replying to this particular point in the future; it just
>> instantly lowers the quality of the dialog.

> The comparison is made because we're talking about a related change, and the
> actual breakage caused by override was fairly recent, so while the decision to
> cause that breakage was made quite some time ago, we were still willing to
> cause that breakage fairly recently, so the implication then is that it would
> be acceptable to do something related which causes less breakage.

This nice one-long-sentence paragraph does little to help, because it just restates the same well-understood position I disagree with, without adding information.

My argument is that the relatedness of the two changes is tenuous, and it explains why I think so. The paragraph above once again presupposes relatedness, and proceeds with a tedious re-explanation of the consequences of that assumption.

We don't make progress like this:

Speaker A: "XYZ, therefore ABC."

Speaker B: "I disagree with XYZ because TUV."

Speaker A: "But since XYZ then ABC."

> Now, that being said, I do think that we need to look at this change in its
> own right, and it needs to justify itself, but clearly folks like Manu think
> that it does justify itself and feel that there's something off if we're
> willing to make the change with override and not this one, when they're
> related, and this one causes even less breakage.

The matters of override and virtual/final are not related, so analyzing the consequences of said relatedness is not very productive.

That doesn't make one necessarily more important than the other, but it must be understood that they are distinct matters, and we can't compare the breakage caused by one with the breakage caused by the other.

Requiring "override" in overriding methods:

1. Protects against categories of bugs that otherwise would be impossible to protect against: accidental overriding and accidental non-overriding.

2. Provides an important maintenance tool for code evolution, statically breaking code that would otherwise change semantics silently.

It's important to note that without "override" there is virtually no protection against these issues. We're talking about an "all goodness" feature, and that kind of thing is in very short supply in this world.
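A minimal sketch of the two bug categories (type and method names are illustrative):

    class Base
    {
        void update() { }
    }

    class Derived : Base
    {
        // Accidental overriding: without a mandatory "override", this would
        // silently hijack Base.update. With the requirement, the compiler
        // rejects it until the intent is spelled out explicitly:
        void update() { }            // error: requires the "override" attribute

        // Accidental non-overriding: a typo'd name marked "override" is
        // caught because it overrides nothing in Base:
        override void updaet() { }   // error: does not override any function
    }

In both cases the code compiles fine (with changed semantics) if the "override" requirement is absent; with it, both mistakes are compile-time errors.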

Choosing "virtual" by default:

1. Fosters flexibility by allowing derived classes to override unannotated methods in base classes.

2. Is suboptimal in speed because users pay for the potential flexibility, even when that flexibility is not actually realized (barring a static analysis called class hierarchy analysis).

3. Ultimately lets the programmer choose the right design by using annotations appropriately.

Choosing "final" by default:

1. Fosters speed by statically binding calls to unannotated methods.

2. Is suboptimal in flexibility because users pay for the speed with loss of flexibility, even when speed is not a concern but flexibility is.

3. Ultimately lets the programmer choose the right design by using annotations appropriately.
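To make the symmetry concrete, here is a sketch of the same design under each default (the "virtual" keyword in the second half is hypothetical; today's D has no such keyword):

    // Under today's virtual-by-default, the speed-sensitive method gets annotated:
    class Renderer
    {
        void draw() { }                       // virtual by default: overridable
        final int frameCount() { return 0; }  // statically bound by annotation
    }

    // Under a hypothetical final-by-default, the annotations would flip:
    // class Renderer
    // {
    //     virtual void draw() { }            // explicitly opened for overriding
    //     int frameCount() { return 0; }     // statically bound by default
    // }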

The introduction of "override" allows a language to choose either final or virtual by default, without being exposed to potential bugs. This is pretty much the entire extent to which "override" is related to the choice of virtual vs. final by default.

Today, D makes it remarkably easy to choose the right design without significant boilerplate, regardless of the default choice:

- "struct" introduces a monomorphic type with a limited form of subtyping (via alias this) and no dynamic binding of methods.

- "final class" introduces a leaf class that statically disallows inheritance and consequently forces static calls to all methods. (BTW I recall there were some unnecessary virtual calls for final classes, has that been fixed?)

- "final { ... }" introduces a pseudo-scope in which all declared methods are final

- "final:" introduces a pseudo-label after which all declared methods are final

(Granted, there's an asymmetry - there's no "~final:" label to end final, which makes it marginally more tedious to arrange final and non-final methods in the class.)
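Put together, a sketch of these building blocks (all names are illustrative):

    struct Pair                      // monomorphic: no dynamic binding at all
    {
        int a, b;
        int sum() { return a + b; }  // always statically bound
    }

    struct Triple
    {
        Pair ab;
        alias ab this;               // limited subtyping: Triple converts to Pair
        int c;
    }

    final class Leaf                 // inheritance statically disallowed
    {
        void work() { }              // all calls bind statically
    }

    class Widget
    {
        void draw() { }              // virtual under today's default

        final                        // pseudo-scope: all methods inside are final
        {
            int width() { return 0; }
            int height() { return 0; }
        }
    }

    class Gadget
    {
    final:                           // pseudo-label: everything below is final
        int size() { return 0; }
        int weight() { return 0; }
    }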

This leaves an arguably small subset of designs, scenarios, projects, and teams that would be affected by the choice of default. When that point has been made, it has been glibly dismissed with an argument along the lines of "yeah, well, programmers will take the path of least resistance, not really think things through, come from C++, and assume the wrong default." That may well be true for a subset of situations, but it further narrows the persona actually affected by the choice of default.

I personally have a hard time picturing someone who is at once obsessed with performance, disinclined to assess it, unwilling to learn how to improve it, and incapable of using simple tools to control it. Yet this persona is placed at the center of the argument that we must change the default right now because it is a huge problem. To top it off, the entire fallacy about override causing more breakage is brought up again. Yes, smoking kills, and the fact that cars kill more people doesn't quite have a bearing on that.

> I do think that virtual-by-default was a mistake and would like to see it
> fixed, but I'm also not as passionate about it as Manu or Don.

Choosing virtual (or not) by default may be dubbed a mistake only in context. With the notable exception of C#, modern languages aim for flexibility first and then do their best to obtain performance. In the context of D in particular, there are arguments for the default going either way. If I were designing D from scratch, it might even make sense to, e.g., force an explicit choice while offering no default whatsoever.

But the bottom line is, choosing the default is not a big deal for D, because this wonderful language offers so many great building blocks for any design one might imagine.

> Manu in
> particular seems to be sick of having to fix performance bugs at Remedy Games
> caused by this issue and so would really like to see non-virtual be the
> default. The folks using D in companies in real-world code seem to think that
> the ROI on this change is well worth it.

I'm wary/weary of polls with a small number of participants. Let's also not forget that these people do use D successfully, and if sticking "final" here and there is the most difficult endeavor that has helped the performance of their programs, I'd say both they and D are in great shape.

> And a technical issue which affects us all is how this interacts with
> extern(C++). Daniel Murphy is having to improve extern(C++) in order to be
> able to port the dmd frontend to D (so that it can properly interact with the
> backends), and the fact that member functions are virtual by default definitely
> causes problems there. He would know the details about that better than I
> would, but IIRC, it had to do with the fact that we needed to be able to
> interface with non-virtual C++ member functions. So, depending on the details
> with that, that alone could make it worth switching to non-virtual by default,
> particularly when the breakage is actually quite loud and easy to fix.

I don't know much about that matter, nor about the argument related to mock injection and such, so I won't comment on this.
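For readers unfamiliar with the issue, a rough sketch of the interop concern (the exact semantics Daniel Murphy was implementing may differ):

    // C++ side, for reference:
    //     class Counter { public: int increment(); };   // non-virtual

    // D side: with virtual-by-default, an unannotated method is assumed
    // virtual and would not match the non-virtual C++ member function;
    // "final" is needed to get the non-virtual mapping.
    extern(C++) class Counter
    {
        final int increment();   // binds to the non-virtual C++ member function
    }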

Finally, I'll note that I'd started a reply to this remark by Manu (who in turn replied to David):

> Is there a reason this change offends you enough to call me names? Or
> can you at least tell how I'm being narrow-minded?

I deleted that reply, but let me say this. In a good argument:

1. Participants have opinions and beliefs derived from evidence they have accumulated.

2. The very ongoing discourse offers additional evidence to all participants by means of exchange of information. This is to be expected because participants have varied backgrounds and often it's possible to assess how competent they are.

3. The merits of various arguments are discussed, appreciated, and integrated within the opinions and beliefs of the participants.

4. A conclusion is reached in light of everything discussed and everybody is richer that way.

In a not-so good argument:

1. Participants each start from an immutable belief.

2. Their preoccupation is to amass, bend, or fabricate any argument that would make that belief prevail, and to neglect any argument to the contrary.

3. The entire discussion has a foregone conclusion for everyone involved, i.e. nobody changes opinions and nothing is gained.

The attitude "I know what's right, the only problem is to make you understand" doesn't serve anyone, because it locks "me" in a trench with no horizon and no mobility, and elicits an emotional response in "you".

Here we don't want to keep "virtual" default and we don't want to make "final" default. We want to do what's right. So the discussion should progress toward finding what's right, not starting from knowing what's right and working arguments from there.


Andrei
