-------Original Message-------
> From: "Richard Lindsey" <[EMAIL PROTECTED]>
> Subject: Natural behavior of integers
> Sent: 17 Nov 2004 16:57:51
>
> I know this is going to sound like a stupid question, but I'm just
> seeking confirmation of this for my own peace of mind. By the "natural"
> behavior of variables in a programming environment, I mean how a
> variable behaves without any special handlers written for bounds
> checking or anything, just how the code runs under normal
> circumstances. Since this may differ from language to language, I'll
> specifically name C++ here.
>
> As far as I've ever observed, if a value exceeds the upper or lower
> limit of the range for its declared type, it wraps around to the other
> end of the spectrum. For example, in an unsigned short int that can
> hold a range of 65536 values, if I were to set it to 60536 and then add
> 10000 to it, its current value should equal 5000, correct? Or in a
> signed variable of the same size, ranging from -32768 to 32767, if I
> set it to 30767 and added 4000 to it, it would then equal -30769,
> right? And if it were initialized to, say, -30767 and I subtracted
> 4000, it should equal 30769?
>
> I can't find a definition of the natural behavior of integers anywhere
> online, and since I've never taken an actual C or C++ course, all of my
> knowledge of it is based on observation. Can someone more learned than
> I confirm this? Or if it's incorrect, tell me why; or if it's quirky,
> point out instances where this wouldn't be the case? Thanks in
> advance :)
>
> Richard Lindsey
>
> --
> ROM mailing list
> [EMAIL PROTECTED]
> http://www.rom.org/cgi-bin/mailman/listinfo/rom
-------Original Message-------
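
For anyone who wants to check this empirically, here is a minimal sketch
(the variable names and the use of <cstdint> fixed-width types are my
own choices, not from the original message):

#include <cstdint>
#include <iostream>

int main() {
    // Unsigned arithmetic is defined by the standard to wrap modulo
    // 2^N: 60536 + 10000 = 70536, and 70536 mod 65536 = 5000.
    std::uint16_t u = 60536;
    u = static_cast<std::uint16_t>(u + 10000);
    std::cout << u << '\n';   // prints 5000

    // Signed types are murkier. Converting an out-of-range value to a
    // signed type was implementation-defined before C++20 (and is
    // modular since C++20); overflowing a signed type at its own width
    // (e.g. INT_MAX + 1) is undefined behavior. On ordinary
    // two's-complement machines this wraps as expected:
    // 30767 + 4000 = 34767, which maps to 34767 - 65536 = -30769.
    std::int16_t s = 30767;
    s = static_cast<std::int16_t>(s + 4000);
    std::cout << s << '\n';   // prints -30769 on typical platforms
}

Note that because of integer promotion, both additions above actually
happen at int width; it is the narrowing conversion back to a 16-bit
type that wraps. On a platform where int itself were 16 bits, the
signed case would be genuine undefined behavior rather than a
well-defined wrap.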