2010/11/22 Gael Varoquaux <gael.varoqu...@normalesup.org>:
> On Mon, Nov 22, 2010 at 09:12:45PM +0100, Matthieu Brucher wrote:
>> Hi ;)
>
> Hi bro
>
>> > does anybody have, or know where I can find, some N-dimensional
>> > dichotomy optimization code in Python (BSD licensed, or equivalent)?
>
>> I don't know of any code, but it shouldn't be too difficult going
>> through a KdTree.
>
> I am not in terribly high-dimensional spaces, so I don't really need to
> use a KdTree (but we do happen to have a nice BallTree available in the
> scikit-learn, so I could use it just to play :>).

:D

>> In this case, you may want to check the Nelder-Mead algorithm (also
>> known as down-hill simplex or polytope), which is available in
>> scikits.optimization, but there are other implementations out there.
>
> Interesting reference. I had never looked at the Nelder-Mead algorithm.
> I am wondering if it does what I want, though.
>
> The reason I am looking at dichotomy optimization is that the objective
> function that I want to optimize has local roughness, but is globally
> pretty much a bell-shaped curve. Dichotomy looks like it will get quite
> close to the top of the curve (and I have been told that it works well on
> such problems). One thing that is nice with dichotomy for my needs is
> that it is not based on a gradient, and it works on a convex subset of
> the parameter space.

It seems that a simplex is what you need. It uses the barycenter (more
or less) of the current simplex to find a new candidate point. And it
works well only on convex functions (but in fact almost every optimizer
has an issue with non-convex functions :D).
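
For reference, here is a minimal sketch of the kind of N-dimensional
dichotomy I understand you to mean (the function dichotomy_maximize and
the box bounds are mine, assuming a bell-shaped objective to maximize
over a bounding box):

    import numpy as np
    from itertools import product

    def dichotomy_maximize(f, lo, hi, n_levels=20):
        # Split the current box into 2**ndim sub-boxes, evaluate f at
        # each sub-box centre, and recurse into the best one.
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        for _ in range(n_levels):
            half = 0.5 * (hi - lo)
            best = None
            # Lower corners of all sub-boxes: each coordinate is
            # either lo[i] or the midpoint lo[i] + half[i].
            for corner in product(*zip(lo, lo + half)):
                sub_lo = np.array(corner)
                centre = sub_lo + 0.5 * half
                val = f(centre)
                if best is None or val > best[0]:
                    best = (val, sub_lo)
            lo = best[1]
            hi = lo + half
        return 0.5 * (lo + hi)

With 3 parameters that is only 8 evaluations per level, so it stays
cheap even for a fine final resolution.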

> Will the Nelder-Mead display such properties? It seems so to me, but I
> don't trust my quick read through of Wikipedia.

Yes, it does not need a gradient, and if the function is convex, it
works within a convex subset of the parameter space.
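
For concreteness, a minimal sketch of calling SciPy's Nelder-Mead
implementation (scipy.optimize.fmin); the Gaussian objective is just a
stand-in for your real score:

    import numpy as np
    from scipy.optimize import fmin

    # Stand-in bell-shaped score; fmin minimises, so negate it.
    def neg_score(x):
        return -np.exp(-np.sum((x - 1.0) ** 2))

    xopt = fmin(neg_score, np.zeros(3), xtol=1e-4, disp=False)
    print(xopt)  # should end up close to [1., 1., 1.]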

> I realize that maybe I should rephrase my question to try and draw more
> out of the common wealth of knowledge on this mailing list: what do
> people suggest to tackle this problem? Guided by Matthieu's suggestion, I
> have started looking at Powell's algorithm, and given the introduction of
> www.damtp.cam.ac.uk/user/na/NA_papers/NA2007_03.pdf I am wondering
> whether I should not investigate it. Can people provide any insights on
> these problems?

Indeed, Powell's method may also be a solution. A simplex is just the
closest thing to what you hinted at as an optimization algorithm.
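
A corresponding sketch with SciPy's Powell implementation
(scipy.optimize.fmin_powell), same stand-in objective as above:

    import numpy as np
    from scipy.optimize import fmin_powell

    def neg_score(x):  # stand-in bell-shaped score, negated
        return -np.exp(-np.sum((x - 1.0) ** 2))

    xopt = fmin_powell(neg_score, np.zeros(3), disp=False)
    print(xopt)  # again close to [1., 1., 1.]

Powell's method is also derivative-free, but it proceeds by line
searches along a set of directions rather than by reflecting a simplex.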

> Many thanks,

You're welcome ;)

> Gael
>
> PS: The reason I am looking at this optimization problem is that I got
> tired of watching grid searches optimize the cross-validation score on
> my 3-parameter estimator (not in the scikit-learn, because it is way too
> specific to my problems).

Perhaps you may want to combine it with genetic algorithms. In some of
our global optimization problems, we combine a grid search with a
simplex-based optimizer and simulated annealing, and at some point I
think I'll try replacing the grid search with a genetic algorithm. Your
problem is simpler, though, if it displays some convexity.
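
To make the combination concrete, a rough sketch (cv_score here is a
hypothetical stand-in for your cross-validation score, higher is
better): a coarse grid picks the starting point, a simplex refines it.

    import numpy as np
    from itertools import product
    from scipy.optimize import fmin

    def cv_score(params):
        # Hypothetical stand-in for the cross-validated score of the
        # 3-parameter estimator.
        return -np.sum((np.asarray(params) - 0.3) ** 2)

    # Coarse grid search to seed the simplex ...
    grid = product(np.linspace(0., 1., 5), repeat=3)
    x0 = max(grid, key=cv_score)
    # ... then Nelder-Mead refinement from the best grid point.
    xopt = fmin(lambda p: -cv_score(p), np.array(x0), disp=False)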

Matthieu
-- 
Information System Engineer, Ph.D.
Blog: http://matt.eifelle.com
LinkedIn: http://www.linkedin.com/in/matthieubrucher