On Fri, Mar 04, 2016 at 02:03:14PM +1300, Evan Hanson wrote:
> This allows the client to consider the stream as a written u8vector,
> without needing to mask off the extra bits of any elements outside the
> byte range.
> ---
>  dbg-stub.c | 2 +-
>  1 file changed, 1 insertion(+), 1 deletion(-)
>
> diff --git a/dbg-stub.c b/dbg-stub.c
> index 96227de..bbfa3c5 100644
> --- a/dbg-stub.c
> +++ b/dbg-stub.c
> @@ -381,7 +381,7 @@ send_event(int event, C_char *loc, C_char *val, C_char *cloc, int cln)
>      send_string(rw_buffer);
>
>      for(n = 0; n < reply; ++n) {
> -      sprintf(rw_buffer, " %lu", (unsigned long)((char *)C_data_pointer(x))[ n ]);
> +      sprintf(rw_buffer, " %u", ((unsigned char *)C_data_pointer(x))[ n ]);
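Before getting to the format specifier: as I read the commit message, the
bug being fixed is that the old cast through plain char sign-extends bytes
above 127 on platforms where char is signed, so the client would see huge
numbers instead of values in 128..255. A standalone sketch of the
difference (my own test program, not from the patch):

    #include <stdio.h>

    int main(void) {
        char buf[1] = { (char)0xFF };   /* one byte of u8vector data */

        /* Old code: on a signed-char platform, buf[0] is -1, which the
         * cast to unsigned long turns into ULONG_MAX, not 255. */
        printf("%lu\n", (unsigned long)((char *)buf)[0]);

        /* New code: reading through unsigned char keeps 0..255. */
        printf("%u\n", ((unsigned char *)buf)[0]);
        return 0;
    }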
Shouldn't this be printed as a character?

+      sprintf(rw_buffer, " %c", ((unsigned char *)C_data_pointer(x))[ n ]);

Does it even matter?

Actually, now I'm *really* confused by C's vararg handling and printf.
The manual (and the C11 draft standard) says:

    c      If no 'l' length modifier is present, the int argument is
           converted to an unsigned char, and the resulting character
           is written.

Does this mean an unsigned char argument to a varargs function is
always widened to "int"?  How else can it know how much to take off
the stack?  I can't find anything about this in the spec, and while
it makes sense in practice if you consider the alignment of the rest
of the arguments, I don't really see how this follows from the spec.

And if it really is the case that characters are always widened to
integers, why bother having a %c format specifier in the first place?

Cheers,
Peter
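P.S. For what it's worth, a tiny standalone test of the widening
question (my own sketch, nothing from the patch):

    #include <stdio.h>

    int main(void) {
        unsigned char c = 'A';

        /* If the default argument promotions apply to variadic calls,
         * c is passed as an int in both calls below, and the format
         * specifier only selects how that int is rendered. */
        printf("%u\n", c);   /* prints "65": the value as a number */
        printf("%c\n", c);   /* prints "A":  the value as a byte   */
        return 0;
    }

If that reading is right, %c exists to choose "write the byte itself"
over "write its decimal digits", not to read a narrower argument off
the stack.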