On Monday, 21 March 2016 at 22:29:46 UTC, Steven Schveighoffer wrote:
> It depends on the situation. `foo` may know that `x` is going to be short enough to fit in an `int`.

> The question becomes: if in 99% of cases the user knows he was converting to a signed value intentionally, and of the remaining 1% of cases 99% were harmless "errors", then this is going to be just a nuisance update, and it will fail to be accepted.
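(For readers following along: the pattern Steven describes is roughly the following. `foo` and `x` are his names; everything else, including the `ubyte[]` element type and the explicit cast, is my illustrative assumption.)

    // A minimal sketch, assuming a 32-bit target where size_t is
    // uint and the uint -> int conversion is implicit:
    void foo(ubyte[] x)
    {
        // The author of foo knows x.length always fits in an int,
        // so the narrowing is deliberate; an explicit cast records
        // that intent and would silence a hypothetical warning:
        int len = cast(int) x.length;
        // ... use len as a signed count ...
    }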

My experimentation strongly suggests that your "99.99% false positive" figure is way, *way* off. This stuff is both:

1) Harder for people to get right than you think (you can't develop good intuition about the extent of the problem, unless you spend some time thoroughly auditing existing code bases specifically looking for this kind of problem), and also

2) Easier for the compiler to figure out than you think - I was really surprised at how short the list of problems flagged by the compiler was when I tested Lionello Lunesu's work on the current D codebase.

The false positive rate would certainly be *much* lower than your outlandish 10,000 : 1 estimate, given a good compiler implementation.
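To be concrete about the class of real bug such a check catches, here is my own illustrative example (not one from the audit above); at the time, mixed signed/unsigned comparisons like this compiled without complaint:

    import std.stdio;

    void main()
    {
        int[] arr = [1, 2, 3];
        int threshold = -1;

        // The signed operand is converted to the unsigned type, so
        // -1 becomes size_t.max and the comparison is false, even
        // though mathematically -1 < 3:
        if (threshold < arr.length)
            writeln("mathematically expected branch");
        else
            writeln("actually taken: -1 wrapped to size_t.max"); // prints
    }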

With respect to your specific example:

>> 1) The memory limit on a true 32-bit system is 4GiB, not 2GiB. Even with an OS that reserves some of the address space, as much as 3GiB or 3.5GiB may be exposed to a user-space process in practice.

> Then make it `long len = x.length` on a 64-bit system.

> The only reason I said to assume it's 32-bit is because on a 64-bit CPU, using `int` is already an error. The architecture wasn't important for the example.

Huh? The point of mine which you quoted applies specifically to 32-bit systems. 32-bit array lengths can be greater than `int.max`.
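A minimal sketch of that point, assuming a 32-bit target where `size_t` is `uint` (the 3_000_000_000 length is just an illustrative value above `int.max`; as written, this only compiles on 32-bit, since 64-bit D rejects the implicit `ulong` -> `int` conversion):

    // 32-bit sketch: suppose x.length == 3_000_000_000, which is
    // legal on a system exposing 3GiB+ to the process.
    void example(ubyte[] x)
    {
        int len = x.length; // uint -> int converts implicitly here
        // 3_000_000_000 > int.max (2_147_483_647), so len is now
        // negative and any arithmetic using it is wrong.
    }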

Did you mean to reply to point #3, instead?
