On 19/03/2017 1:22 AM, Oleg B wrote:
Hello. I found strange behavior while casting an enum array and an immutable
array.

import std.stdio;

void main()
{
    enum arr = cast(ubyte[])[0,0,0,1,0,0,0,2,0,0,0,3,0,0,0,4];

    auto arr1 = cast(void[])arr;
    immutable arr2 = cast(immutable(void)[])arr;
    enum arr3 = cast(void[])arr;

    writeln(cast(ushort[])arr1); // [0, 256, 0, 512, 0, 768, 0, 1024]
    writeln(cast(ushort[])arr2); // [0, 256, 0, 512, 0, 768, 0, 1024]
    writeln(cast(ushort[])arr3); // [0, 0, 0, 1, 0, 0, 0, 2, 0, 0, 0, 3, 0, 0, 0, 4]
}

I think it's related to the compiler representing enums like #define.
Is that right? Is this behavior by design?

It took me a bit, but what I think is happening is that 1 and 2 are being cast at runtime, whereas 3 is cast at compile time (CT). At CT the cast is applied per value of the enum array; at runtime (RT) it reinterprets the array's raw bytes, so many bytes per target element. Which sort of makes sense, since at CT the compiler still knows the element type even though you cast it to void[].
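
For what it's worth, here is a minimal sketch of that distinction (not from the original post; the static immutable route is just my assumption about one way to sidestep it). Once the data actually lives in a runtime array, the reinterpret cast works on raw bytes as expected:

import std.stdio;

void main()
{
    enum arr = cast(ubyte[])[0, 0, 0, 1, 0, 0, 0, 2];

    // Copy the manifest constant into a real runtime array first;
    // the cast then reinterprets its raw bytes two at a time.
    ubyte[] rt = arr;
    writeln(cast(ushort[])rt); // [0, 256, 0, 512] on little-endian

    // Assumed alternative: store the data once in the binary instead
    // of as a manifest constant; same byte-level reinterpretation.
    static immutable ubyte[] data = arr;
    writeln(cast(immutable(ushort)[])data); // [0, 256, 0, 512]
}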
