https://gcc.gnu.org/bugzilla/show_bug.cgi?id=104012

--- Comment #6 from Martin Sebor <msebor at gcc dot gnu.org> ---
To expand a bit on the fuzziness at level 1: the logic is documented under the
-Wformat-overflow warning like so:

  Numeric arguments that are known to be bounded to a subrange of their type,
or string arguments whose output is bounded either by their directive’s
precision or by a finite set of string literals, are assumed to take on the
value within the range that results in the most bytes on output.
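A small sketch of what this means in practice (my own illustration, not code from the report; the function name is hypothetical). With the argument provably bounded to [0, 99], level 1 assumes the widest value in that range, "99", when sizing the output:

```c
#include <stdio.h>

/* Hypothetical example: format a bounded index into a fixed buffer. */
static int format_index(int i)
{
    char buf[3];  /* room for "99" plus the terminating NUL */

    if (i < 0 || i >= 100)
        return -1;

    /* Here i is known to be in [0, 99].  Per the documented logic, at
       -Wformat-overflow=1 GCC assumes the value that produces the most
       bytes ("99", two characters), which fits in buf, so no warning is
       issued.  Shrinking buf to char[2] would make the assumed worst
       case overflow and trigger the warning.  */
    return sprintf(buf, "%d", i);
}
```

Compiling with `gcc -O2 -Wformat-overflow=1` should be silent for this function, and warn once `buf` is too small for the assumed worst-case value.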
