On Mon, Apr 14, 2008 at 1:55 PM, Paul Schlie <[EMAIL PROTECTED]> wrote:
>
>
>  >> (as an aside, as most target implementations treat pointers as unsigned
>  >> values, it's not clear that presuming signed integer overflow semantics is
>  >> a reasonable choice for pointer comparison optimization)
>  >
>  > The point is not of presuming signed integer overflow semantics (I was
>  > corrected on this by Ian Taylor). It is of presuming that pointers never
>  > move before the beginning of their object. If you have an array of 20
>  > elements, pointers &a[0] to &a[20] are valid (accessing &a[20] is not
>  > valid), but the compiler can assume that the program does not refer to
>  > &a[-2].
>  >
>  > Paolo
>
>  Yes (and in which case, if the compiler is smart enough to recognize
>  this, it should generate an error, not silently emit arbitrary code
>  [or no code at all]); but the example in question was:
>
>  void f(char *buf) {
>    unsigned int len = 0xFFFFFF00u; /* or similar */
>
>    if (buf + len < buf) puts("true");
>  }
>
>  In which case buf is merely a pointer which may point to any char, not
>  necessarily a char within a particular array, implying buf+len is also just
>  a pointer, which is ultimately compared against buf.
>
>  If all such pointers are presumed to be restricted to pointing within the
>  object they were originally assigned, then all composite pointer arithmetic
>  such as buf+len would be invalid. All this being said, I understand that in
>  general this is an anomalous case; however, on small embedded machines with
>  small memory spaces, or when writing drivers or memory allocators, such
>  pointer arithmetic may be perfectly legitimate, it would seem.
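
A minimal sketch of the rule Paolo describes, assuming an array of 20 ints
(the names are illustrative only):

  void example(void)
  {
    int a[20];
    int *p;

    p = &a[0];     /* valid: the first element */
    p = &a[20];    /* valid: one past the end; must not be dereferenced */
    /* p = &a[-2];    undefined behaviour: before the start of the object */
    (void) p;
  }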

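For reference, a self-contained version of the quoted test case (a sketch
only; <stdio.h> is added here so it compiles as shown):

  #include <stdio.h>

  void f(char *buf)
  {
    unsigned int len = 0xFFFFFF00u;   /* or similar */

    /* On a 32-bit target the machine-level addition wraps, but the C
       standard makes such out-of-object pointer arithmetic undefined,
       so GCC may fold the comparison to false and drop the puts call. */
    if (buf + len < buf)
      puts("true");
  }
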
In the absence of any declared object (as in this testcase, where we just
have an incoming pointer to some unknown object) the compiler can still
assume that any valid object ends, at the latest, at the end of the address
space.  Thus, an object, whether declared or allocated via malloc, never
"wraps" around to address zero, and so ptr + int never "overflows".
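
A check of the kind the original example seems to be after can be written
without relying on pointer wrap-around at all; a minimal sketch, assuming
the caller can pass the number of bytes still available in the buffer (the
"avail" parameter here is hypothetical, not part of the original example):

  #include <stddef.h>

  /* Returns nonzero when len bytes fit in the remaining space.  This is
     a plain unsigned comparison, well defined for all argument values,
     so the compiler has no reason to remove it. */
  int fits(size_t avail, size_t len)
  {
    return len <= avail;
  }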

Richard.
