On 2023-07-31 13:03, Siddhesh Poyarekar wrote:
On 2023-07-31 12:47, Qing Zhao wrote:
Hi, Sid and Jakub,

I have a question about the following portion of the routine “addr_object_size” in gcc/tree-object-size.cc:

  743       bytes = compute_object_offset (TREE_OPERAND (ptr, 0), var);
  744       if (bytes != error_mark_node)
  745         {
  746           bytes = size_for_offset (var_size, bytes);
  747           if (var != pt_var && pt_var_size && TREE_CODE (pt_var) == MEM_REF)
  748             {
  749               tree bytes2 = compute_object_offset (TREE_OPERAND (ptr, 0),
  750                                                    pt_var);
  751               if (bytes2 != error_mark_node)
  752                 {
  753                   bytes2 = size_for_offset (pt_var_size, bytes2);
  754                   bytes = size_binop (MIN_EXPR, bytes, bytes2);
  755                 }
  756             }
  757         }

At line 754, why do we always use “MIN_EXPR”, regardless of whether this is OST_MINIMUM or not?
Shall we use

(object_size_type & OST_MINIMUM
                             ? MIN_EXPR : MAX_EXPR)

instead?


That MIN_EXPR is not for OST_MINIMUM.  It is to cater for allocations like this:

typedef struct
{
   int a;
} A;

size_t f()
{
   A *p = malloc (1);

   return __builtin_object_size (p, 0);
}

Correction, that should be __builtin_object_size (&p->a, 0).

where the returned size should be 1, not sizeof (int).  The mode doesn't really matter in this case.

HTH.

Sid
