> There should have been a macro taking a parameter N (an integer
> constant expression), where the type would have been valid for any N
> up to the maximum width (thus at least 64). For portability, the
> existence of the macro could have also been tested with #ifdef,
> allowing a fallback.
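For concreteness, that pattern would look something like the sketch below. `uint_exact` and its #ifdef test are my hypothetical rendering of the idea; for what it's worth, C23's `_BitInt(N)` together with `BITINT_MAXWIDTH` (guaranteed to be at least 64) is the closest thing that actually exists:

        #include <stdint.h>

        #ifdef uint_exact                 /* hypothetical feature-test */
        typedef uint_exact(24) u24;       /* N-bit unsigned type, here N == 24 */
        #else
        typedef uint_least32_t u24;       /* fallback: next wider standard type */
        #endif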

I don't think this is a good idea at all. The vast majority of "fallback"
code I see in the real world is riddled with bugs - because no one tests
those paths - and the more #ifdefs you have, the faster your software
becomes untestable, because each independent #ifdef doubles the number of
build combinations: ten of them already mean 1024 configurations to test.

"But how can specifying minimum width lead to bugs"? Integer promotion
rules.

        #include <stdint.h>

        uint16_t a = UINT16_MAX;  // 65535
        uint16_t b = 1;
        int c = a + b;            // what is the value of c?

Is the value of `c` 0 due to unsigned wraparound, or 65536 (2^16) because
of integer promotion? You can't know without knowing the rank and width
of `int` on that platform: if `int` is wider than 16 bits, `a` and `b`
are promoted to `int` and `c` is 65536; if `int` is exactly 16 bits wide,
the addition happens in `unsigned int` and wraps to 0.
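Each outcome can of course be forced explicitly; a minimal sketch in
standard C, continuing from the snippet above:

        // force 16-bit wraparound regardless of int's width: c1 == 0
        uint16_t c1 = (uint16_t)(a + b);

        // force the mathematically exact sum: c2 == 65536
        uint32_t c2 = (uint32_t)a + b;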

Any nontrivial software will contain thousands, if not millions, of
arithmetic operations. Trying to babysit every one of them for
"theoretical portability" is not only a massive waste of time, it's very
likely buggy as well unless you can actually test it.

So unless you know for certain that you will be targeting weird
machines, and have the means to test on them: keep it simple and make
things *obviously correct* instead of overcomplicating them for
"theoretical portability".

- NRK
