On Tue, Oct 20, 2009 at 5:40 PM, tallpaul2000 <[email protected]> wrote:
> range for an int and an unsigned int MUST be at least [-32767,+32767] and
> [0,65535], respectively.
> This means they are at least 16 bits wide.

But they can be wider on any particular implementation.
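
For what it's worth, <limits.h> will tell you what a given
implementation actually provides. A quick sanity check (just a
sketch; the output naturally varies by platform):

        #include <limits.h>
        #include <stdio.h>

        int main(void)
        {
            /* The standard only guarantees INT_MAX >= 32767. */
            printf("INT_MIN=%d INT_MAX=%d\n", INT_MIN, INT_MAX);
            /* Storage width in bits (could include padding bits
               on exotic hardware). */
            printf("int occupies %lu bits\n",
                   (unsigned long)(sizeof(int) * CHAR_BIT));
            return 0;
        }

On a typical 32-bit desktop this reports INT_MAX=2147483647, well
above the guaranteed minimum.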

> Similarly, looking at the spec for LONG_MIN, LONG_MAX, and ULONG_MAX, the 
> standard says
> that a long and unsigned long MUST be at least 32 bits wide. The standard
> makes no claim that they can't be wider, but seems _very_ clear on the
> minimum widths.
>
> However, when I look at <stdint.h>,

... on a specific implementation, where an int is 32 bits wide...

> I see the following typedefs:
> ISO C99: 7.18 <stdint.h>
>        typedef unsigned int uint32_t;
>        typedef int int32_t;
>
> This has me scratching my head, because on platforms where an int is
> 16 bits wide, you have a type called "int32_t" that, contrary to its
> name, is 16 bits wide!

No, you have an integer _type_ that _is_ 32 bits wide; which built-in
type it's an alias for varies by implementation.

> It seems to me the typedefs in <stdint.h> should be:
>        typedef unsigned long uint32_t;
>        typedef long int32_t;

They probably are, on an implementation that has 16-bit-wide ints.
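
Something along these lines, i.e. a hypothetical <stdint.h> fragment
for an implementation with 16-bit ints and 32-bit longs (real vendor
headers will differ in detail):

        /* 16-bit int, 32-bit long */
        typedef int            int16_t;
        typedef unsigned int   uint16_t;
        typedef long           int32_t;
        typedef unsigned long  uint32_t;

Either way int32_t is exactly 32 bits wide; only the built-in type it
aliases changes from one implementation to the next.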

> So then by the spec we are guaranteed that an int32_t will be at least 32 
> bits wide.

But an int32_t is supposed to be an integer type that _is_ exactly 32
bits wide, not _at least_ 32 bits.

You appear to be confusing char, int, long, etc., which have minimum
widths, with the intN_t types, which have exact widths.

> This seems to be a pretty serious error in terms of portability.

Not at all. In fact quite the opposite.

If you want a type that's _at least_ 16 bits wide, use an int. If you
want an integer type that _is_ exactly 16 bits wide, use int16_t.
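
C99 also gives you named types for the "at least" case, so you don't
have to remember which built-in type is wide enough. Roughly (a
sketch; see 7.18.1 for the full list):

        #include <stdint.h>

        int16_t       a;  /* exactly 16 bits, no padding; optional,
                             but present on any common hardware */
        int_least16_t b;  /* smallest type with at least 16 bits;
                             always provided */
        int_fast16_t  c;  /* "fastest" type with at least 16 bits;
                             always provided */

If int32_t were allowed to be wider than 32 bits it would be
redundant with int_least32_t, and code that depends on exact widths
(on-the-wire formats, hardware registers) would break.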

> But given these typedefs have been in <stdint.h> for a long time, they
> must be OK for some reason I can't see.
>
> Can somebody please shed some light on this for me?

-- 
PJH

http://shabbleland.myminicity.com/
http://www.chavgangs.com/register.php?referer=9375
http://www.kongregate.com/?referrer=Shabble
