I wonder if it might be better to instead reject VLAs in constexpr
functions altogether.  Not because they're not in C++, but because
C (or gcc) doesn't allow them to be initialized (and so accepting
an initialized VLA is a g++ extension of an extension), and
because in constexpr functions they are rejected without
initialization just like other uninitialized variables.

I don't think we can do this at this time.  E.g. the following program works
even with GCC 5 and -std=c++14:

constexpr int
foo (int n)
{
  int a[n] = { 1, 2, 3 };
  int z = 0;
  for (unsigned i = 0; i < 3; ++i)
    z += a[i];
  return z;
}

int
main ()
{
  constexpr int n = foo (3);
  __builtin_printf ("%d\n", n);
}

This happens to work but I suspect it's only by accident.  When
the number of elements in the initializer is increased to exceed
the number of elements in the VLA, GCC gets into infinite recursion.
(I opened bug 69516 with a test case).  The same error in a non-
constexpr function causes a SEGV at runtime (this is also
a regression WRT 4.9.3 -- I opened bug 69517 for it).

So starting to reject such code might break working programs.  And
we're already able to reject non-standard code with -pedantic-errors.

I agree that there is some risk that it might break some working
programs.  I would expect the most common use of initialized VLAs
to be to set all elements to zero using the "= { }" or "= { 0 }"
syntax.  Initializers with more elements are, IMO, likely to be
a bug where the user doesn't realize they defined a VLA rather
than an ordinary array.  Since VLAs are required to have at least
1 element, would diagnosing initializers with more than one element
more loudly (such as by default or with -Wall as opposed to with
-Wpedantic) be a good solution?

Martin
