Richard Guenther wrote:
On 2/21/06, Richard Kenner <[EMAIL PROTECTED]> wrote:
     But if the values in there do not reflect the reality of what values
     are valid for the type, then I don't see how they can be generally
     useful -- that's my point.  We have two fields that are inaccurate,
     apparently on purpose, and as a result they are basically unusable.

No, they *do* reflect the "reality of what values are valid for the type".
The only glitch, which is not what we're talking about here, is that you have
to have a way to implement the language-defined test to see if the value is
valid or not.  However, the need to implement that relatively uncommon test
shouldn't drive the basic methodology used to represent types.

As you mention in another post, an invalid value can only occur (as in, not
being undefined behavior) with an unchecked conversion.

The use of the term invalid is confusing in connection
with Ada. Ada has two concepts of out of range values

  Invalid values. Caused most notably by uninitialized
  variables. Use of such values is generally not
  erroneous, but rather a bounded error with a small
  subset of reasonable outcomes (program termination,
  exception, or just use the value, but such use cannot
  cause erroneous behavior).

  Abnormal values. Caused e.g. by two tasks messing with
  a variable at the same time. Any use of an abnormal
  variable is erroneous.

The result of a bogus Unchecked_Conversion is implementation defined.
For GNAT, in the case of discrete types, we say that such a
result is invalid rather than abnormal.

'Valid does not have to work for abnormal values, but it must
work correctly for invalid values (in practice it will work
as expected for abnormal values too).

The Ada frontend then has to make sure that for
  BaseType i;
  SubType x = <unchecked>(SubType)i;
  // now the value stored in x may exceed its range
  if (Valid (x))
    ...

that for the Valid (x) test a VIEW_CONVERT_EXPR is used to do the comparison(s)
in the BaseType again, with the VIEW_CONVERT_EXPR telling the compiler that
it cannot look through the cast and infer range information from the type of x.

Sounds exactly right to me.

Richard.
