Alright, so I made a fool of myself in my last post by not looking up the class hierarchy of TypeInfo in object.d. But now I've found the cause of the inconsistent hash problem.
For an int[], the typeid is TypeInfo_Ai. Here's the implementation of TypeInfo_Ai.getHash():

	override hash_t getHash(in void* p)
	{
	    int[] s = *cast(int[]*)p;
	    return hashOf(s.ptr, s.length * int.sizeof);
	}

For a const int[], the typeid is TypeInfo_Const -> TypeInfo_Array -> TypeInfo_i. TypeInfo_Const.getHash() simply forwards the call to TypeInfo_Array.getHash(), which is implemented thus:

	override hash_t getHash(in void* p) @trusted
	{
	    void[] a = *cast(void[]*)p;
	    return hashOf(a.ptr, a.length);
	}

The problem is that a.length is measured in array *elements*, NOT bytes. But hashOf() is being called with this unscaled length, so for any array whose elements are larger than one byte, the hash is computed over only the first (a.length) *bytes* rather than all (a.length) *elements*.

The bug can be demonstrated by the following code:

	const int[] a1 = [1,2,3,4];
	const int[] a2 = [1,3,4,5];
	const int[] a3 = [1,4,5,6];
	auto t1 = typeid(a1);
	auto t2 = typeid(a2);
	auto t3 = typeid(a3);
	assert(t1.getHash(&a1) == t2.getHash(&a2));
	assert(t2.getHash(&a2) == t3.getHash(&a3));

Both asserts pass because the hash is computed over only the first 4 bytes of each array (a1.length==4 is misinterpreted as 4 bytes), i.e., just the first int, so none of the trailing elements contribute to the hash.

This is why the hashes computed for int[] differ from those for const(int)[] and immutable(int)[]. It also explains why the problem never shows up with char[] and string (== immutable(char)[]): char happens to be exactly 1 byte, so element count and byte count coincide and the bug doesn't show itself.

T

-- 
Latin's a dead language, as dead as can be; it killed off all the Romans, and now it's killing me! -- Schoolboy