2011/11/23 annaykay <[email protected]>:
> Hello everyone,
>
> I am working on optimizing some model parameters in my simulation, which
> simulates the impact of communication on the attitude towards a specific
> topic. I want to optimize 17 parameters of a non-linear function to reach a
> minimal error value. Therefore I implemented two different optimization
> algorithms in my simulation.
>
> First I tried the Nelder-Mead algorithm. In this case the problem is that
> my error value first increases, then decreases again and stagnates at an
> unsatisfying value. Is it even possible with the Nelder-Mead method for the
> error to increase, even though I want to minimize it?
>
> Then I also tried a different optimization algorithm, the
> Levenberg-Marquardt algorithm. Here the problem is that the changes in the
> parameters are too small, so the optimization already stops after one
> iteration.
>
> Do you have an idea how to approach this problem, or do you know a
> different optimization algorithm that could suit my problem?
>
> Thanks in advance!
>
> --
> View this message in context:
> http://apache-commons.680414.n4.nabble.com/math-Optimization-Nelder-Mead-and-Levenberg-Marquardt-tp4099186p4099186.html
> Sent from the Commons - User mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
Hi,

If the function you want to optimize has several local optima (local minima,
in your case, since you are minimizing the error), then optimization is
(almost) always problematic, and with 17 parameters that is quite likely.
Are you sure that you cannot obtain an analytical solution? Have you tried
different starting values for the Nelder-Mead algorithm?

Cheers,
Mikkel.
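To illustrate the "different starting values" suggestion, here is a minimal multi-start sketch in plain Java (no Commons Math dependency). The test function `f`, the gradient-descent local search, the step size, and the search interval [-3, 3] are all made up for the demonstration; the local search simply stands in for whatever local optimizer (Nelder-Mead, Levenberg-Marquardt, ...) you actually use. The point is only that a single start can get trapped in the basin of a local minimum, while the best of several random starts usually finds the global one:

```java
import java.util.Random;

public class MultiStartDemo {
    // Toy multimodal function: global minimum near x = -1.30 (f = -3.51),
    // local minimum near x = 1.13 (f = -1.07).
    static double f(double x) {
        return x * x * x * x - 3 * x * x + x;
    }

    // A deliberately simple local minimizer: fixed-step gradient descent
    // with a central-difference numeric derivative. It converges to the
    // local minimum of whichever basin the start x0 lies in.
    static double localMinimize(double x0) {
        double x = x0, step = 0.01, h = 1e-6;
        for (int i = 0; i < 10_000; i++) {
            double grad = (f(x + h) - f(x - h)) / (2 * h);
            x -= step * grad;
        }
        return x;
    }

    // Multi-start: run the local minimizer from several random starting
    // points and keep the best result found.
    static double multiStart(int starts, long seed) {
        Random rng = new Random(seed);
        double bestX = 0, bestF = Double.POSITIVE_INFINITY;
        for (int i = 0; i < starts; i++) {
            double x0 = -3 + 6 * rng.nextDouble(); // random start in [-3, 3]
            double x = localMinimize(x0);
            if (f(x) < bestF) {
                bestF = f(x);
                bestX = x;
            }
        }
        return bestX;
    }

    public static void main(String[] args) {
        // A single start at x = 2.0 sits in the basin of the local minimum
        // and gets trapped there; multi-start escapes it.
        double single = localMinimize(2.0);
        double best = multiStart(20, 42L);
        System.out.printf("single start -> x=%.3f f=%.3f%n", single, f(single));
        System.out.printf("multi-start  -> x=%.3f f=%.3f%n", best, f(best));
    }
}
```

With 17 parameters the same idea applies, only the starting points are vectors and far more starts may be needed; Commons Math also ships multi-start wrappers around its optimizers that automate this loop.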
