On Tue, Nov 23, 2010 at 3:37 AM, Sebastian Walter < [email protected]> wrote:
> On Tue, Nov 23, 2010 at 11:17 AM, Gael Varoquaux
> <[email protected]> wrote:
> > On Tue, Nov 23, 2010 at 11:13:23AM +0100, Sebastian Walter wrote:
> >> I'm not familiar with dichotomy optimization.
> >> Several techniques have been proposed to solve the problem: genetic
> >> algorithms, simulated annealing, Nelder-Mead and Powell.
> >> To be honest, I find it quite confusing that these algorithms are
> >> named in the same breath.
> >
> > I am confused too. But that stems from my lack of knowledge in
> > optimization.
> >
> >> Do you have a continuous or a discrete problem?
> >
> > Both.
> >
> >> Is your problem of the following form?
> >
> >> min_x f(x)
> >> s.t. lo <= Ax + b <= up
> >>      0 = g(x)
> >>      0 <= h(x)
> >
> > No constraints.
>
> Didn't you say that you operate only in some convex hull?
>
> >> And if yes, in which space does x live?
> >
> > Either in R^n, in the set of integers (unidimensional), or in the set of
> > positive integers.
>
> According to http://openopt.org/Problems
> this is a mixed integer nonlinear program: http://openopt.org/MINLP .
> I don't have experience with that solver, but it may take a long time
> to run since it uses branch-and-bound.
> In my field of work we typically relax the integers to real numbers,
> perform the optimization, and then round to the nearest integer.
> This is often sufficiently close to a good solution.

I've sometimes applied a genetic algorithm to the rounding, which can
improve things a bit if needed.

Chuck
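For what it's worth, here is a minimal sketch of the relax-then-round idea
described above. The toy objective f and the two-variable setup are invented
for illustration, and a brute-force floor/ceil neighborhood search stands in
for the genetic-algorithm refinement of the rounding that Chuck mentions:

    # Relax integer variables to reals, optimize with a derivative-free
    # method (Powell, one of the methods named in this thread), then pick
    # the best nearby integer point.
    import itertools

    import numpy as np
    from scipy import optimize


    def f(x):
        # Toy smooth objective with a minimum near a non-integer point.
        return (x[0] - 2.3) ** 2 + (x[1] - 0.7) ** 2 + 0.1 * x[0] * x[1]


    # 1. Treat the integer variables as continuous and optimize.
    x0 = np.zeros(2)
    res = optimize.minimize(f, x0, method="Powell")
    x_relaxed = res.x

    # 2. Instead of plain rounding, evaluate every floor/ceil combination
    #    of the relaxed optimum and keep the best one.  A genetic algorithm
    #    (as suggested above) could replace this step for larger problems.
    candidates = itertools.product(*[(np.floor(v), np.ceil(v)) for v in x_relaxed])
    x_int = min(candidates, key=f)

    print("relaxed optimum:", x_relaxed)
    print("best integer point nearby:", np.asarray(x_int))

The exhaustive floor/ceil search is only practical for a handful of integer
variables (2^n candidates); for more, a heuristic such as the genetic
algorithm mentioned above is the natural replacement.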
