On Thu, Aug 10, 2023 at 05:44:20PM -0400, Jason Merrill wrote:
> Hmm, you really think there's any code at all in the wild relying on
> __builtin_classify_type + array/function decay?  It's a (previously)

Looking at the first uses of the builtin back in the 90s in va*.h, it certainly
relied on array/function decay there (the macros would abort e.g. on
array_type_class, function_type_class and various other return values).
Looking at older versions of tgmath.h, I see just checks for 8/9 (i.e.
real/complex) and those wouldn't be affected by any promotions/decay.
But newer versions of tgmath.h before __builtin_tgmath also check for
1 and they would be upset if char wasn't promoted to int (including the latest
glibc); see the sketch below.
The systemtap macros also use __builtin_classify_type and do check for
pointers, but those seem to be prepared to handle even arrays.
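
To illustrate, a minimal sketch of such a tgmath.h-style check (illustrative
only, not the actual glibc code; the my_tgmath_sqrt name is made up, and the
class values 1/8 are integer_type_class/real_type_class from typeclass.h):

#include <math.h>

/* Generic-math-style dispatch relying on the current behavior: an
   integer argument (including char/short after the default promotions)
   classifies as 1 and maps to the double variant.  If char stopped
   being promoted, it would classify as 2 (char_type_class) and hit the
   trap branch instead.  */
#define my_tgmath_sqrt(x) \
  (__builtin_classify_type (x) == 8 \
     ? (sizeof (x) == sizeof (float) ? sqrtf (x) \
        : sizeof (x) == sizeof (double) ? sqrt (x) : sqrtl (x)) \
   : __builtin_classify_type (x) == 1 \
     ? sqrt (x) /* integer arguments map to double */ \
     : (__builtin_trap (), 0.0))

int
main (void)
{
  char c = 4;
  return my_tgmath_sqrt (c) == 2.0 ? 0 : 1; /* relies on char -> int */
}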

> undocumented built-in, I wouldn't expect anyone outside the project to be
> using it.  So at first glance I'd be inclined to fix it whether or not we
> also allow it to accept a type.  But I don't actually know how it's used, so
> could well be wrong...

The problem with changing the existing builtin after 35+ years, especially
when it was poorly documented:
 -- Macro: __builtin_classify_type (OBJECT)
     Since each machine has its own conventions for which data types are
     passed in which kind of register, your implementation of 'va_arg'
     has to embody these conventions.  The easiest way to categorize the
     specified data type is to use '__builtin_classify_type' together
     with 'sizeof' and '__alignof__'.

     '__builtin_classify_type' ignores the value of OBJECT, considering
     only its data type.  It returns an integer describing what kind of
     type that is--integer, floating, pointer, structure, and so on.

     The file 'typeclass.h' defines an enumeration that you can use to
     interpret the values of '__builtin_classify_type'.
is that users then follow what the implementation actually does rather than
what is documented (because that behavior effectively isn't documented), and
other compilers have implemented the behavior GCC had.  While, as the list
above shows, some macros/headers could cope with such a change, others can
break.  And even if the array/function decay were avoided, one still wouldn't
be able to get various type classes because of the other decays/argument
conversions; see the illustration below.
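
To make the last point concrete, a small illustration of the current behavior
as I understand it (the function names are made up; the class values are the
ones from GCC's typeclass.h):

extern void g (void);

void
show_classes (void)
{
  int a[4] = { 0 };
  char c = 0;
  /* The operand undergoes array/function decay and the integer
     promotions before it is classified, so several classes are
     unreachable with the expression form of the builtin.  */
  int r1 = __builtin_classify_type (a); /* 5 (pointer), not 14 (array) */
  int r2 = __builtin_classify_type (g); /* 5 (pointer), not 10 (function) */
  int r3 = __builtin_classify_type (c); /* 1 (integer), not 2 (char) */
  (void) r1; (void) r2; (void) r3;
}

With the type form the patch parses (e.g. __builtin_classify_type (int[4])),
those classes would become reachable without going through those conversions.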

> > @@ -7850,6 +7851,50 @@ cp_parser_postfix_expression (cp_parser
> >               = parser->non_integral_constant_expression_p;
> >             parser->integral_constant_expression_p = false;
> >           }
> > +       else if (TREE_CODE (stripped_expression) == FUNCTION_DECL
> > +                && fndecl_built_in_p (stripped_expression,
> > +                                      BUILT_IN_CLASSIFY_TYPE))
> > +         {
> > +           /* __builtin_classify_type (type)  */
> > +           auto cl1 = make_temp_override
> > +                        (parser->type_definition_forbidden_message,
> > +                         G_("types may not be defined in "
> > +                            "%<__builtin_classify_type%> calls"));
> > +           auto cl2 = make_temp_override
> > +                        (parser->type_definition_forbidden_message_arg,
> > +                         NULL);
> > +           auto cl3 = make_temp_override (parser->in_type_id_in_expr_p,
> > +                                          true);
> > +           cp_evaluated ev;
> > +           ++cp_unevaluated_operand;
> > +           ++c_inhibit_evaluation_warnings;
> 
> These three lines seem unnecessary for parsing a type.
> 
> > +           tentative_firewall firewall (parser);
> 
> I think you only need a tentative_firewall if you're going to call
> cp_parser_commit_to_tentative_parse yourself, which you don't.

I think I've just copied this from elsewhere; I'll double-check in the
morning which ones aren't really needed.

        Jakub
