On May 1, 2009, at 7:58 PM, Thomas Savitsky wrote:

> I looked into global optimizers, and scipy's anneal is not the
> answer; it gives poor results quite often.  A better choice would
> be ASA, located at http://alumnus.caltech.edu/~ingber/#ASA
> Unfortunately, there are functions which give this (and other)
> optimizers problems, such as the Rosenbrock function:
> (1 - x)^2 + 100*(y - x^2)^2.  Nevertheless, a numerical
> approximation for a global minimum would benefit Sage, in my
> opinion.  But if one were added, how should the end user be
> informed that the answer given may be very wrong, and that
> significant effort tuning the optimizer's options may be
> required for a good result?
>
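[For readers following along: the Rosenbrock function mentioned above is a
standard stress test for optimizers because its minimum sits at the bottom of a
long, narrow, curved valley. A minimal plain-Python simulated-annealing sketch
(not ASA, and not scipy's anneal; the step size, cooling rate, and starting
point here are arbitrary choices) illustrates how such a method improves on its
starting value yet can still stall well away from the true minimum at (1, 1):]

```python
import math
import random

def rosenbrock(x, y):
    # (1 - x)^2 + 100*(y - x^2)^2; global minimum is 0 at (1, 1)
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def anneal(f, start, steps=20000, temp=1.0, cooling=0.999, seed=0):
    # Bare-bones simulated annealing: random perturbations with
    # Metropolis acceptance and a geometric cooling schedule.
    rng = random.Random(seed)
    x, y = start
    cur_val = f(x, y)
    best, best_val = (x, y), cur_val
    for _ in range(steps):
        nx = x + rng.uniform(-0.1, 0.1)
        ny = y + rng.uniform(-0.1, 0.1)
        new_val = f(nx, ny)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta / temp).
        if new_val < cur_val or rng.random() < math.exp((cur_val - new_val) / temp):
            x, y, cur_val = nx, ny, new_val
            if cur_val < best_val:
                best, best_val = (x, y), cur_val
        temp *= cooling
    return best, best_val

point, value = anneal(rosenbrock, start=(-1.5, 2.0))
```

[Whether `value` lands near 0 depends entirely on the tuning parameters, which
is exactly the user-communication problem raised above.]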


I quite like the ASA optimizer; I used a MATLAB interface to it
for my Master's thesis. However, I'm fairly certain its license isn't
GPL-compatible. It would make a nice optional package, though.

Cheers,

Tim.

---
Tim Lahey
PhD Candidate, Systems Design Engineering
University of Waterloo
http://www.linkedin.com/in/timlahey
