Hi...

I'm not sure if this is the right forum for this, but here goes...

I am doing multi-dimensional minimization via conjugate gradients.  According to 
the GSL reference manual, these algorithms proceed by successive line 
minimizations: once the minimizer has converged along a given direction, it 
chooses a new direction in which to search.
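
(For what it's worth, my understanding -- taken from the standard textbook 
formula rather than from anything in the manual, so treat it as an assumption 
about what GSL does -- is that the Fletcher-Reeves variant picks each new search 
direction via

    d_{k+1} = -g_{k+1} + \beta_k \, d_k ,
    \beta_k = \frac{g_{k+1}^T g_{k+1}}{g_k^T g_k} ,

where g_k is the gradient at step k.  It is the line minimization along each 
d_k that I can't find documented.)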

My question is: what method is used for the line minimization?  Does the user 
have any control over it?  From the example at

http://www.gnu.org/software/gsl/manual/html_node/Multimin-Examples.html

it looks like there is simply a step size that increases as we move downhill; 
eventually we overshoot the minimum, and then the algorithm backtracks.  Is this 
right?  I'm not sure why one would do this instead of some kind of Brent method 
or something based on parabolic interpolation.  Nor is it clear to me what the 
algorithm does during the backtracking step.  And when there is a change of 
direction, what step size is used for the first step along the new direction?
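
To make the question concrete, here is a stripped-down sketch modelled on that 
example.  The paraboloid objective and all the numbers are placeholders of my 
own choosing, and I'm assuming the two trailing arguments of 
gsl_multimin_fdfminimizer_set() -- which I read as the size of the first trial 
step and the accuracy required of each line minimization -- are the only 
control the user has over the line search:

#include <stdio.h>
#include <gsl/gsl_errno.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_multimin.h>

/* Placeholder objective: f(x,y) = 10(x-1)^2 + 20(y-2)^2 + 30, minimum at (1,2). */
static double my_f (const gsl_vector *v, void *params)
{
  double x = gsl_vector_get (v, 0), y = gsl_vector_get (v, 1);
  (void) params;
  return 10.0*(x-1.0)*(x-1.0) + 20.0*(y-2.0)*(y-2.0) + 30.0;
}

static void my_df (const gsl_vector *v, void *params, gsl_vector *df)
{
  double x = gsl_vector_get (v, 0), y = gsl_vector_get (v, 1);
  (void) params;
  gsl_vector_set (df, 0, 20.0*(x-1.0));
  gsl_vector_set (df, 1, 40.0*(y-2.0));
}

static void my_fdf (const gsl_vector *v, void *params, double *f, gsl_vector *df)
{
  *f = my_f (v, params);
  my_df (v, params, df);
}

int main (void)
{
  gsl_multimin_function_fdf func;
  gsl_vector *x;
  gsl_multimin_fdfminimizer *s;
  int status, iter = 0;

  func.n = 2;
  func.f = my_f;
  func.df = my_df;
  func.fdf = my_fdf;
  func.params = NULL;

  x = gsl_vector_alloc (2);
  gsl_vector_set (x, 0, 5.0);   /* starting point (arbitrary) */
  gsl_vector_set (x, 1, 7.0);

  s = gsl_multimin_fdfminimizer_alloc (gsl_multimin_fdfminimizer_conjugate_fr, 2);

  /* 0.01 = size of the first trial step, 1e-4 = line-minimization tolerance;
     these appear to be the only user-adjustable parameters. */
  gsl_multimin_fdfminimizer_set (s, &func, x, 0.01, 1e-4);

  do
    {
      iter++;
      status = gsl_multimin_fdfminimizer_iterate (s);
      if (status)
        break;
      status = gsl_multimin_test_gradient (s->gradient, 1e-3);
    }
  while (status == GSL_CONTINUE && iter < 100);

  printf ("found minimum at (%g, %g), f = %g after %d iterations\n",
          gsl_vector_get (s->x, 0), gsl_vector_get (s->x, 1), s->f, iter);

  gsl_multimin_fdfminimizer_free (s);
  gsl_vector_free (x);
  return 0;
}

Even staring at this, I can't tell what the line search inside 
gsl_multimin_fdfminimizer_iterate() actually does with those two numbers, which 
is really the heart of my question.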

I have in mind a function with many minima, and I am interested in how the 
minimum that is found depends on the starting point.  I guess this depends on 
the implementation, so it would be useful if a few more details of the 
minimization algorithm were documented somewhere.

Sorry if this is addressed somewhere and I missed it -- thanks for any help

Rob




