On 01 Mar 2014, at 03:06, Hans-Peter Diettrich wrote:
Numerical constants whose sign matters should be encoded only in decimal. The other formats (hex, oct, bin, ...) are intended for use with binary values, where the bit pattern is important. Then the code compiles correctly on any platform.
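A minimal Free Pascal sketch of that distinction (SmallInt and LongInt are FPC's 16- and 32-bit signed integer types): the decimal literal -1 denotes the same value on every target, whereas the hex literal $FFFF is a bit pattern whose value depends on the width of the variable receiving it.

  program SignVsBits;
  var
    s16: SmallInt;   { 16-bit signed }
    s32: LongInt;    { 32-bit signed }
  begin
    s16 := -1;                 { value -1, independent of width }
    s32 := -1;                 { value -1, independent of width }
    WriteLn(s16, ' ', s32);    { prints: -1 -1 }

    s16 := SmallInt($FFFF);    { all-ones bit pattern -> -1 in 16 bits }
    s32 := $FFFF;              { same digits -> 65535 in 32 bits }
    WriteLn(s16, ' ', s32);    { prints: -1 65535 }
  end.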
On 01 Mar 2014, at 01:19, Ewald wrote:
On 28 Feb 2014, at 23:43, Jonas Maebe wrote:
Because of the (unPascalish) decision to have an unsigned version of the largest supported integer type, there are indeed some cases that require decisions to define the behaviour.
That is perfectly understandable.
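One concrete case, as a hedged sketch (assuming FPC's Int64 and QWord types): the literal $8000000000000000 is 2^63, which exceeds the Int64 value range but is exactly the sign-bit pattern of a negative Int64, so the compiler has to decide which of the two types such a bare constant gets. The casts below sidestep that decision and simply show the two readings of the same bits:

  program LargestUnsigned;
  var
    u: QWord;
    i: Int64;
  begin
    { without a cast, the type given to a bare $8000000000000000 is
      one of the decisions mentioned above; the casts make each
      reading explicit }
    u := QWord($8000000000000000);   { 9223372036854775808 }
    i := Int64($8000000000000000);   { -9223372036854775808, same bits }
    WriteLn(u);
    WriteLn(i);
  end.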