https://gcc.gnu.org/bugzilla/show_bug.cgi?id=98556

--- Comment #11 from CVS Commits <cvs-commit at gcc dot gnu.org> ---
The releases/gcc-8 branch has been updated by Jakub Jelinek
<ja...@gcc.gnu.org>:

https://gcc.gnu.org/g:34274b6f705df96c214455c9a3831842945a43a0

commit r8-10873-g34274b6f705df96c214455c9a3831842945a43a0
Author: Jakub Jelinek <ja...@redhat.com>
Date:   Sat Jan 9 10:49:38 2021 +0100

    tree-cfg: Allow enum types as result of POINTER_DIFF_EXPR [PR98556]

    As conversions between signed integers and signed enums with the same
    precision are useless in GIMPLE, it seems strange that we require the
    POINTER_DIFF_EXPR result to be INTEGER_TYPE.

    If we really wanted to require that, we'd need to change the gimplifier
    to guarantee it, which does not happen for the following testcase.
    What is going on during gimplification is that when we have the
    (enum T) (p - q) cast, it is stripped through
          /* Strip away as many useless type conversions as possible
             at the toplevel.  */
          STRIP_USELESS_TYPE_CONVERSION (*expr_p);
    and when the MODIFY_EXPR is gimplified, *to_p has enum T type while
    *from_p has intptr_t type, and as there is no conversion in between,
    we just create a GIMPLE_ASSIGN from that.

    2021-01-09  Jakub Jelinek  <ja...@redhat.com>

            PR c++/98556
            * tree-cfg.c (verify_gimple_assign_binary): Allow lhs of
            POINTER_DIFF_EXPR to be any integral type.

            * c-c++-common/pr98556.c: New test.

    (cherry picked from commit 0188eab844eacda5edc6257771edb771844ae069)
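
For illustration, here is a minimal sketch of the kind of source that hits the
scenario described in the commit message; it is a hypothetical reduction, not
the actual c-c++-common/pr98556.c test. The wide enumerator is assumed to give
the enum pointer-width, signed precision on the target, so the (enum T) cast of
the pointer difference is a useless conversion in GIMPLE and gets stripped,
leaving a GIMPLE assignment whose rhs is a POINTER_DIFF_EXPR and whose lhs has
enum type:

    /* Hypothetical reduction, not the actual pr98556.c test.  */
    enum T { A = -1, B = __PTRDIFF_MAX__ };  /* assumed to get ptrdiff_t-wide,
                                                signed underlying type */

    enum T
    f (char *p, char *q)
    {
      /* The (enum T) cast is stripped as useless during gimplification,
         so the return-value assignment pairs an enum-typed lhs with a
         POINTER_DIFF_EXPR rhs.  */
      return (enum T) (p - q);
    }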
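
The tree-cfg.c change itself amounts to relaxing the lhs-type test in the
POINTER_DIFF_EXPR case of verify_gimple_assign_binary. The fragment below is
only a rough sketch of what such a relaxed check looks like, using GCC's usual
type predicates; the exact condition and diagnostic wording in the real patch
may differ:

    case POINTER_DIFF_EXPR:
      if (!POINTER_TYPE_P (rhs1_type)
          || !POINTER_TYPE_P (rhs2_type)
          /* Previously the lhs had to be literally INTEGER_TYPE; now any
             signed integral type (such as an enum) of matching precision
             is accepted.  */
          || !INTEGRAL_TYPE_P (lhs_type)
          || TYPE_UNSIGNED (lhs_type)
          || TYPE_PRECISION (lhs_type) != TYPE_PRECISION (rhs1_type))
        {
          error ("type mismatch in pointer diff expression");
          return true;
        }
      break;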
