I could have sworn I read a white paper a few years ago discussing signed vs. 
unsigned in the context of the Blink coding style, showing that using unsigned 
had a performance impact. 

Of course, now I can’t find a reference to it. 
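
From memory, the performance argument went roughly like this (a minimal sketch 
of my own, not the paper's code): with a 32-bit unsigned loop index the 
compiler has to preserve wraparound at 2^32, which can keep it from promoting 
the index to a 64-bit induction variable, whereas signed overflow is undefined 
behaviour, so the optimizer may assume it never happens. Whether that matters 
in practice depends on the compiler, the target, and the surrounding code.

    // Sketch only; a hypothetical example, not taken from the paper.
    double sumUnsigned(const double* data, unsigned count)
    {
        double total = 0;
        for (unsigned i = 0; i < count; ++i) // 32-bit wraparound must be honoured
            total += data[i];
        return total;
    }

    double sumSigned(const double* data, int count)
    {
        double total = 0;
        for (int i = 0; i < count; ++i) // overflow is UB, so the optimizer may assume it cannot wrap
            total += data[i];
        return total;
    }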

But I clearly recall recommendations like the ones you mentioned. 
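
The correctness side of those recommendations is usually illustrated with 
something like the following (my own sketch, not lifted from the guidelines): 
unsigned arithmetic wraps silently, so a value that "can't be negative" 
becomes an enormous positive one instead of an error you can catch.

    #include <cstddef>
    #include <vector>

    // If n > v.size(), v.size() - n wraps to a huge value instead of going
    // negative, and resize() then tries to allocate that much.
    void shrinkBy(std::vector<int>& v, std::size_t n)
    {
        v.resize(v.size() - n);
    }

    // Classic infinite loop: i is unsigned, so "i >= 0" is always true and
    // decrementing past 0 wraps to SIZE_MAX (and an empty vector makes the
    // very first index out of bounds).
    int sumReversed(const std::vector<int>& v)
    {
        int total = 0;
        for (std::size_t i = v.size() - 1; i >= 0; --i)
            total += v[i];
        return total;
    }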


Sent from my iPhone

> On 24 Jan 2023, at 9:00 pm, Myles Maxfield via webkit-dev 
> <webkit-dev@lists.webkit.org> wrote:
> 
> Hello!
> 
> I recently learned that the C++ Core Guidelines recommend against using 
> unsigned to avoid negative values. Section 4.4 on page 73 of The C++ 
> Programming Language says unsigned types should be used for bitfields and not 
> in an attempt to ensure values are positive. Some talks by people on the C++ 
> standards committee (e.g., Herb Sutter) recommend against using unsigned 
> types simply because the value is expected to be positive.
> 
> Should we be avoiding unsigneds for these purposes? WebKit uses unsigneds all 
> over the place, and I’m assuming a fair number of them are there to indicate 
> that negative values are avoided. The C++ recommendation goes against my 
> intuition that the type is there for clarity, to indicate expectations about 
> the meaning and behavior of its value. But if it’s standard practice to just 
> use int instead, perhaps we should update the style guide?
> 
> What do you think?
> 
> —Myles