http://gcc.gnu.org/bugzilla/show_bug.cgi?id=46633
           Summary: [meta-bug] frontends use BITS_PER_UNIT when they mean
                    TYPE_PRECISION (char_type_node)
           Product: gcc
           Version: 4.6.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: other
        AssignedTo: unassig...@gcc.gnu.org
        ReportedBy: amyl...@gcc.gnu.org
            Blocks: 46489

In some places the front ends use BITS_PER_UNIT when they mean
TYPE_PRECISION (char_type_node).  E.g. consider this code from
c-family/c-common.c:fix_string_type:

  else if (TREE_TYPE (value) == char16_array_type_node)
    {
      nchars = length / (TYPE_PRECISION (char16_type_node) / BITS_PER_UNIT);
      e_type = char16_type_node;

On a bit-addressed architecture, you would have BITS_PER_UNIT == 1, but
probably TYPE_PRECISION (char_type_node) == 8, so the divisor no longer
gives the number of chars per char16_t element.
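
A minimal sketch of how that divisor might read with the precision of
char_type_node used instead of BITS_PER_UNIT (illustrative only, not a
committed patch):

  else if (TREE_TYPE (value) == char16_array_type_node)
    {
      /* Chars per char16_t element: divide by TYPE_PRECISION
         (char_type_node) rather than BITS_PER_UNIT, so the count stays
         correct when the two values differ, e.g. on a bit-addressed
         target.  */
      nchars = length / (TYPE_PRECISION (char16_type_node)
                         / TYPE_PRECISION (char_type_node));
      e_type = char16_type_node;
    }

The same substitution would apply to the other places in the front ends
tracked by this meta-bug where BITS_PER_UNIT stands in for the precision
of char_type_node.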