https://gcc.gnu.org/bugzilla/show_bug.cgi?id=98594
--- Comment #3 from Jan Hubicka <hubicka at gcc dot gnu.org> ---
The initialization is removed by the dse1 pass. We get:

ipa-modref: call stmt D.3199 = bitCount::bitCount_bitfield<1, int, glm::packed_highp> (&D.3185); [return slot optimization]
ipa-modref: call to glm::vec<L, int, Q> bitCount::bitCount_bitfield(const glm::vec<L, T, Q>&) [with int L = 1; T = int; glm::qualifier Q = glm::packed_highp]/8 does not use ref: D.3185.D.3097.x alias sets: 3->1
Deleted dead store: D.3185.D.3097.x = x_2(D);
ipa-modref: call stmt D.3199 = bitCount::bitCount_bitfield<1, int, glm::packed_highp> (&D.3185); [return slot optimization]
ipa-modref: call to glm::vec<L, int, Q> bitCount::bitCount_bitfield(const glm::vec<L, T, Q>&) [with int L = 1; T = int; glm::qualifier Q = glm::packed_highp]/8 does not use ref: D.3185 alias sets: 3->3
Deleted dead store: D.3185 ={v} {CLOBBER};

Now the modref summary for the function is:

  loads:
      Limits: 32 bases, 16 refs
      Base 0: alias set 5
        Ref 0: alias set 5
          access: Parm 0 param offset:0 offset:0 size:32 max_size:32

Alias set 5 corresponds to const struct vec, but a different instantiation than alias set 3 used in the store. There is a reinterpret_cast:

  glm::vec<L, typename glm::detail::make_unsigned<T>::type, Q> x(*reinterpret_cast<glm::vec<L, typename glm::detail::make_unsigned<T>::type, Q> const *>(&v));

Turning it into

  glm::vec<L, typename glm::detail::make_unsigned<T>::type, Q> x(*(&v));

makes the aliasing difference go away. So it seems to me that the testcase simply contains a TBAA violation?