https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91351

--- Comment #8 from Marc Glisse <glisse at gcc dot gnu.org> ---
(In reply to Martin Liška from comment #7)
> --- a/gcc/tree.c
> +++ b/gcc/tree.c
> @@ -11926,7 +11926,7 @@ int_cst_value (const_tree x)
>  tree
>  signed_or_unsigned_type_for (int unsignedp, tree type)
>  {
> -  if (ANY_INTEGRAL_TYPE_P (type) && TYPE_UNSIGNED (type) == unsignedp)
> +  if (TREE_CODE (type) == INTEGER_TYPE && TYPE_UNSIGNED (type) == unsignedp)
>      return type;

Ah, that's probably the part that makes the difference. Going from
INTEGRAL_TYPE_P to ANY_INTEGRAL_TYPE_P can't have any impact here, since no
complex or vector types are involved. But going from == INTEGER_TYPE to
INTEGRAL_TYPE_P adds booleans and enums.
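
For reference, the relevant predicates in gcc/tree.h look roughly like this
(paraphrased from memory, the exact layout in the tree may differ slightly):

/* True for INTEGER_TYPE, BOOLEAN_TYPE and ENUMERAL_TYPE.  */
#define INTEGRAL_TYPE_P(TYPE)			\
  (TREE_CODE (TYPE) == ENUMERAL_TYPE		\
   || TREE_CODE (TYPE) == BOOLEAN_TYPE		\
   || TREE_CODE (TYPE) == INTEGER_TYPE)

/* Additionally true for complex and vector types whose element type is
   itself integral.  */
#define ANY_INTEGRAL_TYPE_P(TYPE)		\
  (INTEGRAL_TYPE_P (TYPE)			\
   || ((TREE_CODE (TYPE) == COMPLEX_TYPE	\
        || VECTOR_TYPE_P (TYPE))		\
       && INTEGRAL_TYPE_P (TREE_TYPE (TYPE))))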

It isn't completely obvious that it is wrong to return the enum type itself
when we are not asking for a sign change, but judging from the likely uses of
the function it would be safer to always return an integer type. So I guess we
could change this ANY_INTEGRAL_TYPE_P (type) to (TREE_CODE (type) ==
INTEGER_TYPE || ((TREE_CODE (type) == COMPLEX_TYPE || VECTOR_TYPE_P (type)) &&
TREE_CODE (TREE_TYPE (type)) == INTEGER_TYPE)) or something like that.
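
A minimal sketch of how that condition would look in
signed_or_unsigned_type_for (just illustrating the suggestion, not a tested
patch):

tree
signed_or_unsigned_type_for (int unsignedp, tree type)
{
  /* Only short-circuit for plain integer types (and complex/vector of
     integer), so that enums and booleans are always converted to a real
     integer type, even when the signedness already matches.  */
  if ((TREE_CODE (type) == INTEGER_TYPE
       || ((TREE_CODE (type) == COMPLEX_TYPE || VECTOR_TYPE_P (type))
	   && TREE_CODE (TREE_TYPE (type)) == INTEGER_TYPE))
      && TYPE_UNSIGNED (type) == unsignedp)
    return type;
  /* ... rest of the function unchanged ...  */
}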

I am afraid there are likely other places where we check whether a type is
unsigned (or has wrapping overflow) and then do arithmetic on it in a way that
conflicts with the strict-enums restrictions.

Does the enum really have a precision of 5 bits? I would have expected
(1<<5)-11 instead of 4294967285 (i.e. (1<<32)-11), without looking at it too
closely.
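
For illustration, this is the wrap-around arithmetic I have in mind (a
standalone sketch, not GCC code; the 5-bit mask is just the assumed precision
of the enum):

#include <stdio.h>

int
main (void)
{
  /* -11 reduced modulo 2^5 (a 5-bit precision type): 32 - 11 = 21.  */
  unsigned int wrap5 = (unsigned int) -11 & ((1u << 5) - 1);
  /* -11 reduced modulo 2^32: 4294967296 - 11 = 4294967285.  */
  unsigned int wrap32 = (unsigned int) -11;
  printf ("%u %u\n", wrap5, wrap32);  /* prints "21 4294967285"  */
  return 0;
}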
