Al Chou wrote:
So I pulled out Herr Pietschmann's Brent method class and tested it, and it
threw an exception telling me, "Possibly multiple zeros in interval or ill
conditioned function."

Caused by an incomplete and much too naive implementation. I now have a real implementation of Brent's method (Brent-Dekker) ready and could try to submit a patch over the weekend.
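For readers unfamiliar with it, here is a hedged sketch of the Brent-Dekker algorithm, following the standard textbook pseudocode; this is NOT the actual patch, and the class and method names are my own invention. The idea is to try inverse quadratic interpolation or a secant step, but fall back to bisection whenever the fast step misbehaves, so convergence is guaranteed given a bracket [a, b] with f(a)*f(b) < 0:

```java
import java.util.function.DoubleUnaryOperator;

public class BrentDekker {
    static double solve(DoubleUnaryOperator f, double a, double b, double tol) {
        double fa = f.applyAsDouble(a), fb = f.applyAsDouble(b);
        if (fa * fb > 0)
            throw new IllegalArgumentException("root not bracketed");
        if (Math.abs(fa) < Math.abs(fb)) {       // keep b as the better guess
            double t = a; a = b; b = t;
            t = fa; fa = fb; fb = t;
        }
        double c = a, fc = fa, d = b - a, s;
        boolean usedBisection = true;
        while (fb != 0.0 && Math.abs(b - a) > tol) {
            if (fa != fc && fb != fc) {
                // inverse quadratic interpolation through (a,fa),(b,fb),(c,fc)
                s = a * fb * fc / ((fa - fb) * (fa - fc))
                  + b * fa * fc / ((fb - fa) * (fb - fc))
                  + c * fa * fb / ((fc - fa) * (fc - fb));
            } else {
                s = b - fb * (b - a) / (fb - fa);  // secant step
            }
            // Reject the fast step if it falls outside the safe region or
            // fails to shrink the interval quickly enough.
            double lo = (3.0 * a + b) / 4.0;
            boolean badStep =
                !(s > Math.min(lo, b) && s < Math.max(lo, b))
                || (usedBisection && Math.abs(s - b) >= Math.abs(b - c) / 2.0)
                || (!usedBisection && Math.abs(s - b) >= Math.abs(c - d) / 2.0)
                || (usedBisection && Math.abs(b - c) < tol)
                || (!usedBisection && Math.abs(c - d) < tol);
            if (badStep) {                         // fall back to bisection
                s = (a + b) / 2.0;
                usedBisection = true;
            } else {
                usedBisection = false;
            }
            double fs = f.applyAsDouble(s);
            d = c;                                 // second-to-last iterate
            c = b; fc = fb;
            if (fa * fs < 0) { b = s; fb = fs; }   // keep the sign change
            else             { a = s; fa = fs; }
            if (Math.abs(fa) < Math.abs(fb)) {
                double t = a; a = b; b = t;
                t = fa; fa = fb; fb = t;
            }
        }
        return b;
    }

    public static void main(String[] args) {
        // x*x - 2 changes sign on [1, 2], so the bracket requirement holds.
        System.out.println(solve(x -> x * x - 2.0, 1.0, 2.0, 1e-12));
    }
}
```

The bisection fallback is what distinguishes this from plain interpolation methods: it is never slower than twice bisection in the worst case, yet typically converges superlinearly.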

 - It's easy to outsmart yourself and create code that's too finicky for
non-numericist users.

Non-numericists (or whatever) tend to underestimate the traps in numerical calculation, because the vast majority of problems behave well with modern algorithms most of the time. Unfortunately, unforeseen misbehaviour tends to come up at the worst possible time, often with the user barely noticing that something went wrong. In particular for root finding:

- The function for which a zero is sought could be implemented badly, with excessive round-off error and/or bit cancellation, like naive evaluation of dense high-order polynomials. This may significantly displace the zero point, and it often leads to multiple numerical roots where only one was analytically expected.

- The function may be inherently or numerically ill conditioned, like x*sin(1/x) near zero, or ((x-1)^1000)*x^50 for a 50-bit mantissa.

- It's hard to know in advance when to trade performance for robustness. One criterion for root finders is how often the function is evaluated, and it is generally assumed this is expensive compared to any calculation the solver itself makes. This can make the difference between bisection, which gains one bit per evaluation and needs ~53 iterations for an improvement of 1E-16 in accuracy, whether the function is well behaved or not, and Newton, which ideally doubles the correct bits per evaluation and needs ~5 iterations (each evaluating *two* functions) for a 1E-16 improvement. Obviously, if accuracy matters and function evaluation is slow, fast algorithms are hard to avoid, but precisely defining the necessary accuracy and deciding what counts as "slow" can be time consuming and hair-raising.

- Detailed knowledge about the function (and other aspects of the problem) beats all kinds of clever guesses by sophisticated solving engines every time. Most algorithms are only really robust if you can provide a bracket for the zero. For general functions, this is as hard as or harder than nailing down the root itself.
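To make the evaluation-count trade-off concrete, here is a minimal sketch (my own toy example, not from any library) comparing both methods on the well-behaved function f(x) = x^2 - 2, whose positive root is sqrt(2):

```java
// Bisection gains ~1 bit of accuracy per function evaluation; Newton
// roughly doubles the number of correct bits per step, at the cost of
// evaluating both f and f' each iteration.
public class RootDemo {
    static double f(double x)  { return x * x - 2.0; }  // root at sqrt(2)
    static double df(double x) { return 2.0 * x; }

    // Bisection: needs a bracket [a, b] with f(a)*f(b) < 0.
    static double bisect(double a, double b, int maxIter) {
        for (int i = 0; i < maxIter; i++) {
            double m = 0.5 * (a + b);
            if (f(a) * f(m) <= 0) b = m; else a = m;
        }
        return 0.5 * (a + b);
    }

    // Newton: no bracket needed, but requires the derivative
    // and a reasonable starting point.
    static double newton(double x, int maxIter) {
        for (int i = 0; i < maxIter; i++) {
            x -= f(x) / df(x);
        }
        return x;
    }

    public static void main(String[] args) {
        double target = Math.sqrt(2.0);
        // ~53 bisection steps on [1, 2] reach full double precision...
        System.out.println(Math.abs(bisect(1.0, 2.0, 53) - target));
        // ...while 5 Newton steps from x0 = 2 already get there,
        // using far fewer evaluations in total.
        System.out.println(Math.abs(newton(2.0, 5) - target));
    }
}
```

Both errors come out near machine epsilon, but bisection needed 53 evaluations of f versus Newton's 5 evaluations each of f and f'.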
If you know the function has a smooth second derivative and no zero of the first derivative in a certain interval (say x > 1), just use Newton, if necessary with a numerical derivative, or the secant method without bracketing, and you'll get your root, if it exists.

J.Pietschmann




