On 09/20/2018 08:08 AM, Vincent Lefevre wrote:
On 2018-09-17 10:03:48 -0600, Martin Sebor wrote:
On 09/17/2018 06:00 AM, Umesh Kalappa wrote:
Hi All,

When we try to compile the below case from trunk gcc we get the below
warning (-Wconversion) i.e

void start(void) {
 char n = 1;
 char n1 = 0x01;
 n &=  ~n1;
}

$xgcc -S  warn.c -nostdinc -Wconversion
 warning: conversion from ‘int’ to ‘char’ may change value [-Wconversion]
  n &=  ~n1;
[...]
It looks like a bug to me.

In C, declaring n1 const avoids the warning at -O2 but not
at -O0.

Perhaps at some optimization level, GCC determines that the
expression is safe (thus no longer emits the warning), i.e.
that n & ~n1 is necessarily representable in a char.

That doesn't seem quite right -- GCC determines the
type of the bitwise AND expression to be different between
the optimization levels.

No, the type of this AND expression is always int. The question
is whether this int is necessarily representable in a char.

In C++, declaring n1 const avoids the warning regardless of
optimization levels.

If the constant propagation is done at -O0, this could explain
the behavior.

Or do you mean that GCC remembers the type the data come from,
i.e. assuming char is signed, if n1 is of type char, then ~n1
is necessarily representable in a char, and thus can be regarded
as being of type char in its analysis?

What I'm saying is that the type that determines whether or
not to issue a warning in this case is computed in
the shorten_binary_op() function.  The function is passed
the operands of the &= expression and returns the expression's
"new" type.  When n1's value is known (i.e., when it's const
and with -O2) and fits in char, and when n's type is char
(or under a bunch of other conditions), the function returns
the type char.  Comments in the code indicate it's
an optimization.  That may be fine as far as code correctness
goes but it doesn't seem quite right or robust to me because
it makes the warning appear inconsistent, both between
languages, and in C, between optimization levels.  Delaying
the warning until a later stage (e.g., until folding as Jason
suggested) would make it more consistent.  It wouldn't solve
all problems (e.g., it would still be prone to false positives
in unreachable code), but solving those by delaying it even
further could easily lead to others.

Martin
