On 6/6/2013 3:12 PM, Jonathan M Davis wrote:
> On Thursday, June 06, 2013 14:57:00 Walter Bright wrote:
>> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
>>> (The tool I'm envisioning
>>> would add final annotations or prompt the user to add them.)
>>
>> Sorry, that's never going to fly.
>
> It could tell the programmer which functions it _thinks_ don't need to be
> virtual, but it can't be 100% correct. So, it would effectively be a lint-like
> tool targeting possible devirtualization opportunities. It would actually be
> potentially useful regardless of whether virtual or non-virtual is the
> default, since programmers may have needlessly marked functions as virtual.
> But if it's a question of whether it's a good solution for optimizing away
> virtuality instead of making functions non-virtual, then I don't think that it
> would fly - not if optimization is a prime concern. It would just be a nice
> helper tool for static analysis which could give you suggestions on things you
> might be able to improve in your program.

I know. But people are never going to use that tool.


> But as it sounds like the primary argument which has swayed you towards making
> non-virtual the default is tied to cleaner code evolution and maintenance
> rather than performance, the suggestion obviously wouldn't be a viable
> counterargument for going with virtual-by-default.

The thing is, when code 'works' there is rarely sufficient motivation to go back and annotate things for safety and performance (which is why the tool above will be a failure). Code that works is left alone, and we see the situation Manu is talking about.

But if it's final by default and the user needs a method to be virtual, then he has to go back and add the annotation - without it the code won't work, and the compiler will tell him it doesn't work.

I wouldn't have changed my mind if it were possible for the compiler to auto-finalize methods.

BTW, this is also why D hasn't opted for the pointer tagging system Rust has. It all looks great on paper, but I suspect that in practice not much use will be made of it - people will just default to using the most widely usable pointer type, their code will work, and they'll forget about the rest of the annotations.

I have a lot of experience with this from 16-bit DOS code. There we had all kinds of pointer types - near, far, SS-relative, CS-relative, etc. You know what people did? They used the default pointer type. Almost nobody used the optimized pointer types, even though they got big speed boosts when used appropriately.

What does work is to throw -O at the code and have the compiler go to town and optimize the hell out of it. As much as possible, we should be selecting default semantics that enable -O to kick ass.
