Gabriel Dos Reis wrote:
> Furthermore, I've read people suggesting that we are gratuitously
> breaking code.  That is misleading.  The code was invoking undefined
> behaviour and, before, we did not make any explicit guarantee about
> the semantics. 
> It is one thing to argue for changing gear; but, one should try to
> stay in "honest bounds".
>
> | I'm not exactly opposed to that, but I do
> | wonder if it's the best use of people's time.  But this is free
> | software, and people choose their own priorities.
>
> Yup.  We just have to make sure we don't end up in a mess.
>
> -- Gaby
>   

What I am actually trying to draw attention to is the fact that we make
undefined behavior even more undefined in *some* cases while leaving
others intact.  Moreover, the semantics of the check were already changed
for 4.2 to hide that ICE further - the twice-casted case as well.  And
the current behavior runtime-traps too much code, shooting down not only
the ICE case but well-behaved code too.  In fact, every cast between
pointers that are incompatible according to the standard is transformed
into a trap, even sane conversions like from 'const char *' to 'char *'.
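
For example (a minimal sketch; 'print_it' is a made-up name, not code
from the tree), a direct call through a cast to a function type that
differs only in the const-ness of a pointer parameter is, strictly
speaking, undefined, yet it is exactly the kind of code that now gets
replaced by a runtime trap:

  #include <stdio.h>

  static void print_it (char *s) { puts (s); }

  int main (void)
  {
    /* Call through a cast to an incompatible function pointer type.
       The only mismatch is 'const char *' vs. 'char *', but the call
       is currently turned into a trap.  */
    ((void (*) (const char *)) print_it) ("hello");
    return 0;
  }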

Suppose that in the future the optimizer becomes smart enough to inline
a call made through an assign-casted function pointer, so that the ICE
shows up again.  Is it reasonable to keep tightening the check again
and again?
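
By "assign-casted" I mean roughly the following pattern (again only a
sketch with made-up names): the cast is hidden behind an assignment, so
the call itself looks type-correct and the check does not fire today,
but an optimizer that propagates 'fp' into the call would see the same
mismatch and could trip over the same ICE:

  #include <stdio.h>

  static void print_it (char *s) { puts (s); }

  int main (void)
  {
    /* The incompatible cast happens at the assignment; the call goes
       through 'fp' and is not trapped today.  A smarter inliner could
       fold this back into a direct mismatched call.  */
    void (*fp) (const char *) = (void (*) (const char *)) print_it;
    fp ("hello");
    return 0;
  }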
