On 12/16/2012 2:52 PM, deadalnix wrote:
How much performance is sacrificed compared to a looser integer semantics (I
frankly don't know), and how much do programs benefit from it (I suspect very
little, but I may be wrong)?

Recall that C is standardized to work on 1's complement (and sign-magnitude) machines, which affects the details of its "undefined behavior" rules for signed integers.

D is standardized to work only on 2's complement machines, so less is undefined.
