On Wed, 26 Feb 2014 21:18:01 -0500, Bruce A Johnson wrote:
On Feb 26, 2014, at 6:23 PM, Bruce A Johnson <johns...@umbc.edu> wrote:

The NonLinearConjugateGradientOptimizer does a line search for a zero of the gradient (see the comment from the source below), rather than a search for a minimum of the function (the latter is what is used in Numerical Recipes and in the simple discussion on Wikipedia: http://en.wikipedia.org/wiki/Nonlinear_conjugate_gradient_method). Is this wise? It seems a clever idea, but on a complicated surface with numerical errors the zero of the gradient may not be at a function minimum, and the algorithm could become a de-optimizer. I ask because (in a problem too complex to reproduce easily) I'm sometimes getting junk as output from this routine.
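
To make the concern concrete, here is a small stand-alone illustration (a toy function and bracket invented for the example, not the optimizer's actual code): along a search line, the derivative of phi(t) = f(x + t*d) vanishes at maxima as well as at minima, so a solver that only looks for a sign change of that derivative can happily return a maximum:

    import org.apache.commons.math3.analysis.UnivariateFunction;
    import org.apache.commons.math3.analysis.solvers.BrentSolver;

    public class RootVsMinimumDemo {
        public static void main(String[] args) {
            // phi(t) = sin(t) stands in for the objective along the search
            // line; its derivative phi'(t) = cos(t) plays the role of the
            // dot product of the gradient with the search direction.
            UnivariateFunction dPhi = new UnivariateFunction() {
                public double value(double t) {
                    return Math.cos(t);
                }
            };

            // A root-only line search merely brackets a sign change of phi'.
            // In the bracket [0, 3] that sign change is at t = pi/2, which
            // is a *maximum* of phi, not a minimum (the minimum is at 3*pi/2).
            double t = new BrentSolver(1e-10).solve(100, dPhi, 0.0, 3.0);
            System.out.println("root of phi' = " + t);           // ~1.5708
            System.out.println("phi at root  = " + Math.sin(t)); // ~1.0 (a maximum)
        }
    }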

Bruce

Comment for the LineSearchFunction:

 * The function represented by this class is the dot product of
 * the objective function gradient and the search direction. Its
 * value is zero when the gradient is orthogonal to the search
 * direction, i.e. when the objective function value is a local
 * extremum along the search direction.
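
For contrast, a value-based line search of the kind used in Numerical Recipes (and described on the Wikipedia page above) minimizes phi(t) itself rather than solving phi'(t) = 0. A minimal sketch with the same stand-in function, using the univariate Brent optimizer (illustration only, not a description of the actual fix):

    import org.apache.commons.math3.analysis.UnivariateFunction;
    import org.apache.commons.math3.optim.MaxEval;
    import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;
    import org.apache.commons.math3.optim.univariate.BrentOptimizer;
    import org.apache.commons.math3.optim.univariate.SearchInterval;
    import org.apache.commons.math3.optim.univariate.UnivariateObjectiveFunction;
    import org.apache.commons.math3.optim.univariate.UnivariatePointValuePair;

    public class ValueBasedLineSearchSketch {
        public static void main(String[] args) {
            // phi(t) = sin(t) again stands in for the objective along the
            // search line.
            UnivariateFunction phi = new UnivariateFunction() {
                public double value(double t) {
                    return Math.sin(t);
                }
            };

            // Brent *minimization* of phi itself: the best point is only
            // ever replaced by one with a lower function value, so the
            // search cannot stop at the maximum at pi/2.
            UnivariatePointValuePair best = new BrentOptimizer(1e-10, 1e-12).optimize(
                    new MaxEval(200),
                    new UnivariateObjectiveFunction(phi),
                    GoalType.MINIMIZE,
                    new SearchInterval(0.0, 2.0 * Math.PI));

            System.out.println("t*      = " + best.getPoint()); // ~4.712 (3*pi/2) here
            System.out.println("phi(t*) = " + best.getValue()); // ~ -1.0
        }
    }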

Just realized, in reviewing all open bugs, that this has already been
reported as MATH-1092 (
https://issues.apache.org/jira/browse/MATH-1092 )

I agree with the assigned priority; this is a Major bug.

I've attached a patch for this issue.

Let me know whether it's OK to apply it.
[Among other things, it also obsoletes the inner class "BracketingStep".]

Regards,
Gilles

