Jérôme M. Berger Wrote:

> Steven Schveighoffer wrote:
> > When we're talking about the difference between O(1) and O(lgn), I'll
> > take accuracy over speed in my compiler any day.
>       And when we're talking about the difference between 10s and 55s for
> a minimal loss of accuracy, which will you take? Especially if the
> accuracy loss is less than is lost elsewhere (due to holes in the
> ranges).

Really?  You rebuilt the compiler with your range propagation algorithm and 
verified that it adds 10 seconds versus an accurate one that adds 55s?  How 
long did the overall compile take?  I'd hazard a guess that a code base where 
your algorithm adds 10s takes at least a few hours to compile.  Is 55s that 
bad at that point?

Again, if it costs the compiler an insignificant amount of extra time to be 
more accurate, I'm all for accuracy over speed when the cost is down at that 
level of insignificance.  I'd say the added cost has to reach at least 10% of 
the total compile time before it makes any sizable difference.

-Steve
