http://gcc.gnu.org/bugzilla/show_bug.cgi?id=57748

--- Comment #19 from Richard Biener <rguenth at gcc dot gnu.org> ---
Barking up the wrong tree.  A hacky fix looks like:

Index: gcc/expr.c
===================================================================
--- gcc/expr.c  (revision 202043)
+++ gcc/expr.c  (working copy)
@@ -4753,6 +4753,9 @@ expand_assignment (tree to, tree from, b
        {
          enum machine_mode address_mode;
          rtx offset_rtx;
+         rtx saved_to_rtx = to_rtx;
+         if (misalignp)
+           to_rtx = mem;

          if (!MEM_P (to_rtx))
            {
@@ -4785,6 +4788,11 @@ expand_assignment (tree to, tree from, b
          to_rtx = offset_address (to_rtx, offset_rtx,
                                   highest_pow2_factor_for_target (to,
                                                                   offset));
+         if (misalignp)
+           {
+             mem = to_rtx;
+             to_rtx = saved_to_rtx;
+           }
        }

       /* No action is needed if the target is not a memory and the field


The volatile bitfield case needs to be audited as well:

      /* If the bitfield is volatile, we want to access it in the
         field's mode, not the computed mode.
         If a MEM has VOIDmode (external with incomplete type),
         use BLKmode for it instead.  */
      if (MEM_P (to_rtx))
        {
          if (volatilep && flag_strict_volatile_bitfields > 0)
            to_rtx = adjust_address (to_rtx, mode1, 0);
          else if (GET_MODE (to_rtx) == VOIDmode)
            to_rtx = adjust_address (to_rtx, BLKmode, 0);
        }

This checks the wrong RTX when expansion took the movmisalign path.  Or
rather, -fstrict-volatile-bitfields doesn't seem to work properly for
misaligned accesses?
