Hi all,

During the review of some commits that are in the process of being
upstreamed from Chrome OS, people noticed that chipset drivers like to
define their own TRUE/FALSE macros (sometimes prefixed, too), and I
have seen a bunch of #define BIT{0-31} ... defines as well, because
that seems to be the house rule in some firmware communities.

I think we should seek uniformity here: decide on a style, recommend
it, clean up the tree to match, and help people stay consistent
through lint tests. What I don't know, however, is what that style
should look like.

So, two topics:

1. TRUE/FALSE
Do we want such defines? If so, TRUE/FALSE, or true/false, or
True/False, or ...?
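For what it's worth, C99's <stdbool.h> already gives us bool, true and
false, so one option would be to standardize on that and drop the
per-driver defines entirely. A minimal sketch of what driver code could
look like then (device_present and the register layout are made up
purely for illustration, not taken from any real driver):

  #include <stdbool.h>
  #include <stdint.h>
  #include <stdio.h>

  /* Hypothetical status register layout, for illustration only. */
  #define STATUS_PRESENT_MASK 0x01

  /* C99 bool/true/false instead of a driver-local TRUE/FALSE pair. */
  static bool device_present(uint32_t status)
  {
          return (status & STATUS_PRESENT_MASK) != 0;
  }

  int main(void)
  {
          printf("present: %d\n", device_present(0x01));
          return 0;
  }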

2. BIT16 vs BIT(16) vs (1 << 16) vs 0x10000
I don't think it makes sense to mandate a single one of these (0x3ff is
certainly more readable than BIT9 | BIT8 | BIT7 | BIT6 | BIT5 | BIT4 |
BIT3 | BIT2 | BIT1 | BIT0), but I doubt we need both BIT16 and BIT(16).
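For reference, the Linux kernel settled on a single parameterized
BIT(x) macro rather than 32 separate defines. A sketch of what that
could look like for us (written from memory, not copied from any
existing header), which still composes fine with plain hex masks where
those read better:

  #include <stdint.h>
  #include <stdio.h>

  /* One parameterized macro instead of 32 separate BITn defines;
     fully parenthesized so it is safe inside larger expressions. */
  #define BIT(x) (1UL << (x))

  int main(void)
  {
          uint32_t ctrl = 0;

          ctrl |= BIT(16);        /* same value as (1 << 16) or 0x10000 */
          ctrl |= 0x3ff;          /* plain hex still wins for wide masks */

          printf("ctrl = 0x%08lx\n", (unsigned long)ctrl);
          return 0;
  }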


Patrick
-- 
Google Germany GmbH, ABC-Str. 19, 20354 Hamburg
Court of registration and number: Hamburg, HRB 86891; registered office: Hamburg
Managing directors: Matthew Scott Sucherman, Paul Terence Manicle

