I noticed the default capture mode and the discriminator both used ints. That seems excessive. This patch shrinks them to 8 bits and 16 bits respectively. I suppose the discriminator could use the remaining 24 bits of an int allocation unit, if we're worried about more than 64K lambdas per function. I know, users are strange :) On a 64-bit system this saves 64 bits, because the two ints also dragged in 32 bits of trailing padding.
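For the curious, here's a minimal standalone sketch of where the 64 bits
come from on a typical LP64 target. This is not the GCC code: the struct
names and the stand-in field types (void * for the tree/vec pointers,
unsigned for location_t, a made-up capture_mode enum) are just for
illustration, and the exact layout assumes the usual x86_64 ABI.

  #include <cstdio>

  enum capture_mode { CPLD_NONE, CPLD_COPY, CPLD_REFERENCE };

  struct before
  {
    void *extra_scope;      /* 8 bytes */
    void *pending_proxies;  /* 8 bytes */
    unsigned locus;         /* 4 bytes */
    capture_mode mode;      /* 4 bytes */
    int discriminator;      /* 4 bytes -> 28, padded to 32 */
  };

  struct after
  {
    void *extra_scope;      /* 8 bytes */
    void *pending_proxies;  /* 8 bytes */
    unsigned locus;         /* 4 bytes */
    capture_mode mode : 8;  /* 1 byte */
    short discriminator;    /* 2 bytes at offset 22 -> 24 total */
  };

  int main ()
  {
    printf ("before=%zu after=%zu\n", sizeof (before), sizeof (after));
    /* Prints "before=32 after=24" on x86_64: 32 bits shaved off the
       two fields themselves, plus the 32 bits of trailing padding the
       old layout needed to reach 8-byte alignment.  */
    return 0;
  }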

            gcc/cp/
            * cp-tree.h (struct tree_lambda_expr): Shrink
            default_capture_mode & discriminator.


--
Nathan Sidwell
diff --git i/gcc/cp/cp-tree.h w/gcc/cp/cp-tree.h
index a599f3bad1c..d43c53ae29a 100644
--- i/gcc/cp/cp-tree.h
+++ w/gcc/cp/cp-tree.h
@@ -1442,8 +1442,8 @@ struct GTY (()) tree_lambda_expr
   tree extra_scope;
   vec<tree, va_gc> *pending_proxies;
   location_t locus;
-  enum cp_lambda_default_capture_mode_type default_capture_mode;
-  int discriminator;
+  enum cp_lambda_default_capture_mode_type default_capture_mode : 8;
+  short int discriminator;
 };
 
 /* Non-zero if this template specialization has access violations that
