I don't see why 10_000_000_000 could not be an **int** on a 64-bit platform 
and produce an error on a 32-bit platform. Since it is always possible 
to write 10_000_000_000'i64, this is not a restriction.
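
Here is a minimal Nim sketch of that idea (assuming the 'i64 suffix behaves as the manual describes): the suffix pins the literal's type explicitly, so the constant stays portable whatever the platform word size.

```nim
# The 'i64 suffix fixes the literal's type, independent of the platform word size.
const big = 10_000_000_000'i64
echo big            # 10000000000 on both 32-bit and 64-bit targets
```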

Furthermore, it would make things consistent, since const c = 10 * 1_000_000_000 
would give an **int** on a 64-bit machine and an error on a 32-bit machine 
(or would it give an **int64**? I don't have a 32-bit machine to 
check, but that seems unlikely).
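
For reference, a small sketch of the multiplication case (the 32-bit behaviour is my expectation, not something I have verified):

```nim
# Both factors are plain int literals; the product needs more than 32 bits,
# so on a 32-bit target this const is expected to fail with an overflow error.
const c = 10 * 1_000_000_000
echo typeof(c)      # prints "int" on a 64-bit machine
```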

But I will not fight over this point, which is minor. As long as it is clearly 
stated that big literals are **int64**, I think I can live with that and use 
a conversion to get an **int** on 64-bit platforms. The real issue is the 
small inconsistency in the manual which misled me.
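
The conversion workaround I have in mind is something like the following sketch (it should compile on 64-bit platforms, where int is wide enough to hold the value):

```nim
const big = 10_000_000_000'i64
const asInt = int(big)   # explicit conversion; fine where int is 64 bits wide
echo asInt
```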
