On Monday, 4 August 2014 at 02:31:36 UTC, John Carter wrote:
> On Monday, 4 August 2014 at 02:18:12 UTC, David Bregman wrote:

>> His post basically says that his real-life experience leads him to believe that a static analyzer based on using information from asserts will very likely generate a ton of warnings/errors, because real-life code is imperfect.

>> In other words, if you use that information to optimize instead, you are going to get a ton of bugs, because the asserts are inconsistent with the code.

> No.
>
> My experience says that deeper optimization comes from a deeper understanding of the dataflow, and with a deeper understanding of the dataflow come stricter warnings about defective usage.

Yes, but that isn't what is being proposed here. This is about optimization, not warnings or errors.

> I.e. good compiler writers, as Walter and the GCC guys clearly are, don't just slap in an optimization pass out of nowhere.
>
> They are all too painfully aware that if their optimization pass breaks anything, they will be fending off thousands of complaints that "Optimization X broke ...".

If you read the earlier threads, you will see that Walter freely admits this will break code. Actually, he says that such code is already broken. This doesn't involve new warnings; the code will just break silently. It would be very difficult to do otherwise (see Daniel Gibson's reply to your post).
