Issac Goldstand wrote:
Jim Jagielski wrote:
On Thu, May 01, 2008 at 06:52:58PM -0500, William A. Rowe, Jr. wrote:
Lucian Adrian Grijincu wrote:
On Fri, May 2, 2008 at 2:18 AM, Roy T. Fielding <[EMAIL PROTECTED]>
wrote:
Why? The type char is defined by the C standard to be an 8-bit
signed integer.
The type unsigned char is defined to be an 8-bit unsigned integer.
Why would
we want to add a bunch of unnecessary casting?
Not quite: http://home.att.net/~jackklein/c/inttypes.html
That doesn't resolve Roy's question of "why overload signed char and
unsigned char?"
Can anyone point to a platform where int8_t/uint8_t !=
signed/unsigned char?
If so, I agree with the patch.
I must have misunderstood the orig request... I thought it was simply
creating int8_t/uint8_t to complement the existing int*_t/uint*_t types
That's what I read too, and would answer Roy's question as "uniform
syntax/readability". Not a strong reason, but certainly no harm that
I can see in it...
Apologies, with the amount of traffic this generated, this request must
look like I'm trolling!
The reason for wanting the (u)int8 types was primarily readability,
i.e. to distinguish between whether you are manipulating character data
or numerical data. Moreover, there are times where you specifically
require an 8-bit uint, e.g. where 255 + 1 == 0.
If using (u)int8_t is going to break compatibility on otherwise
compatible platforms then using it really isn't a good idea. As in
practice CHAR_BIT == 8 on almost all platforms, would it be acceptable
to typedef apr_(u)int8_t as (un)signed char only if CHAR_BIT == 8, and
leave them undefined otherwise? The range of supported platforms could
be further extended by first checking for a C99 implementation and using
(u)int8_t if available, then falling back to CHAR_BIT == 8 and
(un)signed char, or leaving the types undefined. I'm not sure if this is
acceptable according to the APR standards, but it would work pretty well.
It should only break programs that specifically require 8-bit ints on
platforms that don't provide them, and would give an early, noisy
failure rather than allowing the program to compile but then not behave
correctly. Is CHAR_BIT defined on all platforms that APR supports?
I could of course do this myself in my own headers. However,
readability would not be helped by having my_uint8_t's mixed with
apr_uint16_t's, and declaring apr_uint8_t myself seems like pretty bad
practice.
Regards,
Chris