As part of bridging a set of bit-masks in a C API, I declared an enum and a set:
```nim
type
  Flag = enum
    A = 4
    B = 6
    # ... more flags ...
  Flags = set[Flag]
```
After about half an hour of debugging a mysterious EINVAL return value from
the C API 😖 I finally discovered that **the set values have the wrong integer
values**:
```nim
check cast[int]({A}) == 16
check cast[int]({B}) == 64
```
The checks fail — turns out `{A}` is 1 and `{B}` is 4! 🤯
I figured out that I can work around this by adding a dummy enum member whose
value is 0. So it appears that Nim's `set` assigns bits relative to the lowest
enum value, instead of using the enum values as absolute bit positions.
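Here's a minimal sketch of that workaround, assuming it's fine to pad the enum; the `flagPad` member name is my own invention, and I cast to `uint8` since the padded set's range (0..6) fits in one byte:

```nim
type
  Flag = enum
    flagPad = 0  # dummy member: anchors the set's bit numbering at ordinal 0
    A = 4
    B = 6
  Flags = set[Flag]

# With the low value forced to 0, bit positions match the enum values.
assert cast[uint8]({A}) == 16  # bit 4
assert cast[uint8]({B}) == 64  # bit 6
```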
Is this a bug, or intentional? If the latter, is there a cleaner workaround,
like a pragma?
—Jens
PS: Don't ask me why this API doesn't use bits 0-3! I didn't write it. I
suspect there are some private flags, or maybe there used to be other flags
that are now obsolete.