https://gcc.gnu.org/bugzilla/show_bug.cgi?id=123898

--- Comment #4 from GCC Commits <cvs-commit at gcc dot gnu.org> ---
The master branch has been updated by Tamar Christina <[email protected]>:

https://gcc.gnu.org/g:bcdc17211a70f290d46e5d10164c4bc776a89d1a

commit r16-7304-gbcdc17211a70f290d46e5d10164c4bc776a89d1a
Author: Tamar Christina <[email protected]>
Date:   Thu Feb 5 08:07:33 2026 +0000

    middle-end: use inner variable when determining deferred FMA order [PR123898]

    If we defer an FMA creation, the code tries to determine the order of the
    operands before deferring.  To do this it compares the operands against the
    result expression (which should contain the multiplication expression).

    However, the multiply might be wrapped in a conversion.  This change has us
    strip one level of conversion (the most that convert_mult_to_fma supports
    handling) and only then do the comparison.

    We cannot strip ops[0] and ops[1] and store them stripped, since after the
    deferral, if we create an FMA, we need to know the original types;
    convert_mult_to_fma handles the conversions during FMA creation anyway.

    There's probably a helper similar to strip_nop_view_converts but I couldn't
    find one, since many of the stripping helpers are recursive or don't support
    stripping VIEW_CONVERTs.

    gcc/ChangeLog:

            PR tree-optimization/123898
            * tree-ssa-math-opts.cc (strip_nop_view_converts): New.
            (convert_mult_to_fma): Use it.

    gcc/testsuite/ChangeLog:

            PR tree-optimization/123898
            * gcc.target/aarch64/sve/pr123898.c: New test.
