Hi Mathew,

You could use Newton's method to optimize for each vi sequentially. If you have an expression for the Jacobian, that's even better.
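For the Newton step on a single component, something along these lines could work. This is a rough, untested sketch: it uses scipy.optimize.fmin_bfgs (a quasi-Newton routine), the name maximize_component is just a placeholder, and the way it handles None values is a crude assumption on my part.

import numpy as np
from scipy import optimize

def maximize_component(f, i, x0, y0, fprime=None):
    """Find an (x, y) maximizing the i-th value returned by f(x, y).

    fmin_bfgs is a quasi-Newton routine; if you can write down a
    function fprime(p) returning the gradient of -v_i at p = [x, y],
    pass it in and convergence should beat finite differences.
    """
    def neg_vi(p):
        vi = f(p[0], p[1])[i]
        # Crude handling of undefined (None) values: a large penalty
        # pushes the search back toward regions where v_i is defined.
        return 1e300 if vi is None else -vi

    return optimize.fmin_bfgs(neg_vi, np.asarray([x0, y0]),
                              fprime=fprime, disp=False)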
What I'd do is write a class with a method f(self, x, y) that records the result of f(x, y) each time it is called. I would then sample the x, y space very coarsely where I guess my solutions are. You can then take the x, y where v1 is maximum as the initial point for Newton's method and iterate until you converge to the solution for v1. Since the class stores every point computed during the search for the optimum, your initial guess for v2 should be a bit better than it was for v1, which should speed up the convergence to the solution for v2, and so on. (I put a rough sketch of this below, after the quoted message.)

If you have multiple processors available, you can scatter the function evaluations among them using IPython. It's easier than it looks.

Hope someone comes up with a nicer solution,

David

On Wed, May 6, 2009 at 3:16 PM, Mathew Yeates <myea...@jpl.nasa.gov> wrote:
> I have a function f(x,y) which produces N values [v1,v2,v3 .... vN]
> where some of the values are None (only found after evaluation)
>
> each evaluation of "f" is expensive and N is large.
> I want N x,y pairs which produce the optimal value in each column.
>
> A brute force approach would be to generate
> [v11,v12,v13,v14 ....]
> [v21,v22,v23 ....... ]
> etc
>
> then locate the maximum of each column.
> This is far too slow ......Any other ideas?
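P.S. In case the recording-class idea above wasn't clear, here is a rough, untested sketch of what I mean. The names (RecordingF, best_start, optimize_all) are just placeholders, it assumes the coarse grid finds at least one non-None value for every component, and the None handling is the same crude penalty as in the earlier snippet.

import numpy as np
from scipy import optimize

class RecordingF(object):
    """Wrap the expensive f(x, y) and remember every evaluation."""

    def __init__(self, expensive_f):
        self.expensive_f = expensive_f
        self.points = []   # every (x, y) evaluated so far
        self.values = []   # the corresponding [v1, ..., vN] lists

    def __call__(self, x, y):
        vals = self.expensive_f(x, y)
        self.points.append((x, y))
        self.values.append(vals)
        return vals

    def best_start(self, i):
        """The recorded (x, y) with the largest i-th value, ignoring
        evaluations where that value was None."""
        best, best_val = None, -np.inf
        for (x, y), vals in zip(self.points, self.values):
            if vals[i] is not None and vals[i] > best_val:
                best, best_val = (x, y), vals[i]
        return best

def optimize_all(expensive_f, N, xs, ys):
    f = RecordingF(expensive_f)
    # Coarse sampling pass: every point lands in the record.
    for x in xs:
        for y in ys:
            f(x, y)
    solutions = []
    for i in range(N):
        # Warm start from the best point recorded so far -- this
        # includes points evaluated while refining v1 ... v_{i-1}.
        x0, y0 = f.best_start(i)
        def neg_vi(p, i=i):
            vi = f(p[0], p[1])[i]
            return 1e300 if vi is None else -vi
        solutions.append(optimize.fmin_bfgs(neg_vi, [x0, y0], disp=False))
    return solutions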