On Tue, 13 Nov 2018 09:46:17 -0500, Steven Schveighoffer wrote:
> Maybe the biggest gripe here is that enums don't prefer their base types
> over what their base types convert to. In the developer's mind, the
> conversion is:
>
> A => int => (via VRP) short
>
> which seems more complex than just
>
> A => int
It affects explicit casts too:

    void foo(short a) { writefln("short %s", a); }
    void foo(int a) { writefln("int %s", a); }

    foo(cast(int)0);  // prints: short 0

In order to force the compiler to choose a particular overload, you either need to assign the value to a variable or use a struct with alias this.

C++, Java, and C# all default to int, even for bare literals that fit into bytes or shorts, and they let you use casts to select overloads. C++ has some weird stuff where an enum that doesn't fit into an int is an equal match for all integer types:

    void foo(unsigned long long);
    void foo(short);

    enum A : unsigned long long { a = 2 };

    foo(a);  // ambiguous!

But if you just have an unsigned long long value that's not in an enum, it only matches the unsigned long long overload.

In C#, if you define multiple implicit casts from a type that match multiple overloads, the compiler prefers the smallest matching type, and it prefers signed over unsigned types. However, for this situation to come up at all, you have to define implicit conversions to multiple numeric types, so it's not directly comparable.

Anyway, VRP overload selection hit me yesterday (in an accepts-invalid sort of way): I was calling a function `init_color(short, short, short, short)` with a bunch of values that I had explicitly cast to int. When I tried wrapping the call in a function, I discovered that the compiler had implicitly converted the ints to short. Not the end of the world, but I had thought a cast would set the type of the expression (instead of just, in this case, truncating floating point numbers).