On Monday, 13 October 2014 at 15:21:32 UTC, Daniel N wrote:
On 10/11/2014 7:23 AM, IgorStepanov wrote:
class A
{
    int i;
    alias i this;
}

class B
{
    int i;
    alias i this;
}

class C
{
    A a;
    B b;
    alias a this;
    alias b this;
}

My preferred solution would be to reject the 2nd alias declaration outright.

I don't see any value in intentionally creating the above pattern. _If_ it occurs, it is most likely an unintentional side effect of refactoring, so it should error out as close as possible to the real error.

This code says that C is a subtype of A and also a subtype of B.
The user can rely on this fact in their code:
void foo(B);

C c = new C;
foo(c); //Ok.
Of course, we shouldn't allow the user to implicitly convert c to int, because the conversion is ambiguous (it could go through either A or B):
int i = c; // error: ambiguous
However, the user can explicitly cast c to one of its supertypes, which is convertible to int:
int i = cast(B)c; //Ok
To summarize, I disagree with the suggestion to disallow this code at the type-semantic stage.
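
For completeness, here is a minimal, self-contained sketch of the behaviour I describe, assuming a compiler that accepts multiple alias this declarations per aggregate (today's compilers accept only one); foo and main are just illustrative names:

class A { int i; alias i this; }
class B { int i; alias i this; }

class C
{
    A a;
    B b;
    alias a this;
    alias b this;
}

void foo(B b) {}

void main()
{
    C c = new C;
    c.a = new A;
    c.b = new B;

    foo(c);             // OK: only the path through b yields a B
    // int x = c;       // error: ambiguous, both a and b convert to int
    int y = cast(B) c;  // OK: select the B path explicitly, then B -> int
}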
