https://gcc.gnu.org/bugzilla/show_bug.cgi?id=89052

            Bug ID: 89052
           Summary: excessive data segment size causes a hang
           Product: gcc
           Version: 9.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: middle-end
          Assignee: unassigned at gcc dot gnu.org
          Reporter: msebor at gcc dot gnu.org
  Target Milestone: ---

Making a silly mistake and using the wrong macro, such as SIZE_MAX instead of
MAX_SIZE, as the array size when defining a global object seems to cause the
compiler to hang, presumably while trying to allocate huge amounts of memory.
It's not immediately obvious what the problem might be in such a case.

GCC already warns when the size of any single object exceeds some maximum, but
it could help prevent this kind of mistake by computing the size of all global
objects (or even the size of local objects in each function) and issuing a
-Wlarger-than= warning, or even an error, when the total size of all of them
exceeds the same limit.  The current permissive limit is PTRDIFF_MAX, but I
don't think there are many 64-bit systems that can actually handle that much
memory, so using a lower limit would be even better.
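
For illustration, here is a hypothetical test case (not from the original
report) where each object is individually within the per-object limit, so the
existing -Wlarger-than= check stays silent, but the combined size exceeds
PTRDIFF_MAX; the aggregate check proposed above would diagnose it:

#include <stdint.h>

/* Each array is just over half the limit, so neither one alone triggers
   the per-object -Wlarger-than= warning...  */
char big1[PTRDIFF_MAX / 2 + 1];
char big2[PTRDIFF_MAX / 2 + 1];
/* ...but together they need more than PTRDIFF_MAX bytes of data segment,
   which no current 64-bit system can actually provide.  */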

$ (cat a.c && ulimit -v 10000 && gcc -Wall -c a.c)
#include <stdint.h>

#define MAX_SIZE 32

char a[SIZE_MAX / 2] = "1";   // typo: should have been MAX_SIZE / 2
char b[SIZE_MAX / 2] = "x";   // ditto

xgcc: internal compiler error: Segmentation fault signal terminated program cc1
Please submit a full bug report,
with preprocessed source if appropriate.
See <https://gcc.gnu.org/bugs/> for instructions.
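
For comparison, the definitions the test case presumably intended (using
MAX_SIZE, as the comments note) compile quietly and take a trivial amount of
space:

char a[MAX_SIZE / 2] = "1";   // 16-byte array, as intended
char b[MAX_SIZE / 2] = "x";   // ditto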
