On 11/10/2011 12:33 PM, foobar wrote:
> 1. This makes sense on 32-bit platforms. What about 64-bit platforms?

D works fine on 64-bit platforms.
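That's because D's integer widths are fixed by the language definition; only the pointer-sized types track the platform. A quick sketch:

static assert(int.sizeof == 4);                  // still 32 bits on 64-bit targets
static assert(size_t.sizeof == (void*).sizeof);  // 4 on 32-bit, 8 on 64-bit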


> 2. What about int(128), int(16), int(8)? For the non-default (!= 32) you do
> want to know the size. In what way are short and byte better?

Short and byte have many decades of use as 16-bit and 8-bit types. I don't know anyone who is confused by this.
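And in D those widths are nailed down by the language, not the platform. A two-line check:

static assert(byte.sizeof  == 1);   // 8 bits  (unsigned: ubyte)
static assert(short.sizeof == 2);   // 16 bits (unsigned: ushort)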


> 3. The default optimal size should indeed have an alias such as 'int'.


> A much better scheme IMO is to define a general type and predefine an easy
> to remember alias for the default (32). Wait a sec, that's what Chapel did...
> The problem isn't with "int" but rather with the proliferation of the other
> types for all the combinations of signedness and size.

I think Chapel has solved a non-existent problem.

It's not like nobody has thought of declaring integers that way before. They have, over and over, for decades. It's just that few like the result.
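For illustration only, a Chapel-style width-parameterized integer can even be emulated in D itself with a template over the bit count (the name Int here is hypothetical, not anything in the language or its library):

// Hypothetical sketch: map a width parameter onto the built-in types.
template Int(int bits)
{
    static if (bits == 8)       alias byte Int;
    else static if (bits == 16) alias short Int;
    else static if (bits == 32) alias int Int;
    else static if (bits == 64) alias long Int;
    else static assert(0, "unsupported width");
}

Int!(16) x;   // exactly the same thing as: short x;

Which rather makes the point: the machinery is easy, it just buys you a noisier spelling of the same four types.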

You're also free to add the following to your code:

alias byte int8;    // int8 now names an 8-bit signed integer
alias short int16;  // int16 now names a 16-bit signed integer
alias int int32;    // int32 now names a 32-bit signed integer

If you feel it improves your code, go for it.
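With those in scope, the sized names work anywhere the built-ins do (the variable names are just for illustration):

int8  flags;   // same as: byte flags;
int16 count;   // same as: short count;
int32 total;   // same as: int total;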

