On 10 April 2013 21:09, Regan Heath <re...@netmail.co.nz> wrote:

> On Wed, 10 Apr 2013 11:59:32 +0100, Dicebot <m.stras...@gmail.com> wrote:
>
>  On Wednesday, 10 April 2013 at 10:53:26 UTC, Regan Heath wrote:
>>
>>> Hmm..
>>>
>>>  A is not final.
>>>>
>>>
>>> True.  But, I don't see how this matters.
>>>
>>>  A has no internal linkage. It can be inherited from in other
>>>> compilation units.
>>>>
>>>
>>> False.  In this first example we are compiling A and B together (into an
>>> exe; I left that off), so the compiler has all sources and all uses of all
>>> methods of A (and B).
>>>
>>>  notVirt is virtual.
>>>>
>>>
>>> It may actually be (I don't know), but it certainly does not have to be
>>> (the compiler has all sources/uses), and my impression was that it /should/
>>> not be.
>>>
>>> R
>>>
>>
>> If it is compiled all at once and compiled into an executable binary, then
>> yes, your examples are valid and the compiler _MAY_ omit virtual.
>>
>
> Exactly the point I was trying to make.  I wanted to establish the point
> at which the design problems (what D defines/intends to do) arise, vs when
> the implementation problems arise (DMD not doing what D intends).
>
>
>  But
>> a) DMD doesn't do it as far as I am aware.
>>
>
> Maybe, maybe not.  I have no idea.  My understanding of the design
> decision is that DMD will eventually do it.


I feel like I'm being ignored. It's NOT POSSIBLE.
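
To make that concrete, here is a minimal sketch of the separate-compilation
case (only A, B and notVirt are names from the earlier examples; the module
layout and the callIt helper are made up here purely for illustration).
Nothing in a.d tells the compiler whether an override of notVirt exists
elsewhere, so it has to keep the call virtual:

    // a.d: compiled on its own, e.g. into a static library
    module a;

    class A
    {
        // Virtual by default: unless the compiler can prove that no
        // override exists anywhere, calls go through the vtable.
        void notVirt() {}
    }

    void callIt(A obj)
    {
        // When a.d is compiled in isolation this must stay an
        // indirect call, because obj might really be a B.
        obj.notVirt();
    }

    // b.d: a separate compilation unit the library never sees
    module b;
    import a;

    class B : A
    {
        // Legal, because notVirt was never declared final.
        override void notVirt() {}
    }

Compile a.d and b.d together into one executable with no override present and
the compiler could in principle devirtualise the call; compile them
separately, or ship a.d as a library, and it cannot.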

>> b) It is a quite uncommon and restrictive build setup.
>>
>
> Maybe at present.
>
> Let's assume DMD can remove virtual when presented with all sources
> compiled in one shot.
>
> Let's assume it cannot if each source is compiled separately.  Is that an
> insurmountable problem?  A design problem?  Or is it simply an
> implementation issue?  Could an obj file format be designed to allow DMD to
> perform the same optimisation in this case as in the one-shot case?  My
> impression is that this should be solvable.
>
> So, that just leaves the library problem.  Is this also insurmountable?  A
> design problem?  Or is it again an implementation issue?  Can D not mark
> exported library methods as virtual/non-virtual?  When user code derives
> from said exported class, could D not perform the same optimisation for
> that class?  I don't know enough about compilation to answer that.  But, I
> can see how, if the library itself manifests an object of type A (which may
> actually be an internal derived sub-class of A), there are clearly issues.
> But, could DMD not have two separate definitions for A, and use one for
> objects manifested from the library, and another locally for user-derived
> classes?  I don't know, these are all just ideas I have on the subject.  :p


That sounds overly complex and error-prone. I don't believe the source of
an object is actually trackable in the way required to do that.
I can't conceive of any solution of this sort being viable. And even if it
were theoretically possible, when can we expect to see it implemented?
It's not feasible. The _problem_ is that functions are virtual by default.
It's a trivial problem to solve; however, it's a major breaking change, so
it will never happen.
Hence my passing comment that spawned this whole thread: I see it as the
single biggest critical mistake in D, and I'm certain it will never be
changed. I've made my peace, however disappointing it is to me.
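
For anyone skimming the thread, a rough sketch of what "virtual by default"
means in practice today, and what the explicit opt-out looks like (the class
and method names here are invented for illustration):

    class Widget
    {
        // Virtual by default: any subclass, in any compilation unit,
        // may override this, so calls are dispatched through the vtable.
        void draw() {}

        // Explicitly final: cannot be overridden, so calls can always
        // be dispatched directly (and potentially inlined).
        final void id() {}
    }

    class Button : Widget
    {
        override void draw() {}     // fine
        // override void id() {}    // error: cannot override a final function
    }

The change argued for above is just flipping that default, so a method would
be non-virtual unless its author explicitly opts it in to being overridable;
that is the breaking change I'm saying will never happen.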
