I'm trying to make sense of the rules for 'typeof'. It's difficult because DMD's behaviour is so different from the spec. Here are four simple cases.

// This doesn't compile on D1.
//alias typeof(int*int) Alias1;

// This compiles in D1, but not in D2.
alias int Int;
alias typeof(Int*Int) Alias2;

// Yet this DOES compile in D2!
typeof(T*U) foo(T, U)(T x, U y) { return x*y; }
alias typeof(foo(Int, Int)) Alias3;

// And this fails on both D1 and D2, with a dreadful error message.
//alias typeof(foo(int)) Alias4;

I can't see anything in the spec to say why ANY of these examples should compile. Yet the existing template constraints feature relies on the Alias3 case.
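For instance, a constraint written in this style (a hypothetical sketch with made-up names, modelled on the foo template above) only typechecks because typeof() tolerates types where expressions are expected:

```d
// Hypothetical sketch: this constraint works only because typeof()
// accepts T*U -- two types used as if they were values, the same
// behaviour that makes the Alias3 case compile.
auto mul(T, U)(T x, U y) if (is(typeof(T*U)))
{
    return x * y;
}
```

Outlaw types-as-expressions inside typeof(), and every constraint of this shape breaks with it.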

I can see two ways forward:
(1) Enforce the existing spec: make every use of a type as an expression an error. This would break a lot of existing code, including several cases in the DMD test suite! You'd generally need to insert a .init whenever using a type inside a typeof(), which would make some code considerably uglier. I'm also not sure what happens with alias parameters: if A is an alias to a type, typeof(A*A) would have to become typeof(A.init*A.init), but if it's an alias to a variable, it should remain typeof(A*A).
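Under option (1), the foo template and the Alias3 declaration above would have to be rewritten along these lines (a sketch of the .init style; I haven't verified it against any particular DMD version):

```d
alias int Int;

// Option (1) style: use .init to obtain a value of each type, so that
// typeof() only ever sees genuine expressions.
typeof(T.init * U.init) foo(T, U)(T x, U y) { return x * y; }

// The call inside typeof() likewise needs values, not types:
alias typeof(foo(Int.init, Int.init)) Alias3;
```

This is exactly the uglification I mean: every type name inside a typeof() grows a .init suffix.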

(2) Define that, inside a typeof() expression, any type T is translated into T.init. The syntax for typeof() would need to be changed in order to allow the Alias1 case.

Note, however, that in both cases there's no such thing as .init for tuples; it might need to be added.

Behaviour (2) is probably more convenient; behaviour (1) is easier to justify. Either way, I think the existing behaviour of typeof() doesn't make much sense.
