rkevinburton wrote:
> 
> Thank you, I had not considered using "gradient" in this fashion. Now, as an
> add-on question: you (and others) have suggested using SANN. Does your
> answer change if instead of 100 "variables" or bins there are 20,000? From
> the documentation, L-BFGS-B is designed for a large number of variables,
> but maybe SANN can handle this as well.
> 
> Kevin
> 

  It's a question of time and space.  Try the problem for 100, 500, 1000 ...
variables to see how the memory usage and time scale.  (At a guess, memory
will be linear in N and not too bad; time will be horrible.)  I haven't
followed the thread very carefully, but if any of the linear programming
solutions solve your problem, they will be far more efficient.  It sounds as
though you have an extremely non-trivial optimization problem here; the
brute-force approach exemplified by SANN may not work, so you will have to
map your problem onto a framework (such as linear programming) that strives
for efficiency rather than generality.  (L-BFGS-B is out of the question.)
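As a minimal sketch of the scaling check suggested above (the quadratic
objective here is a stand-in, not the original poster's problem): time
optim()'s SANN method at increasing problem sizes and watch how the elapsed
time grows.

```r
## Hypothetical scaling check: time method = "SANN" for growing N.
## The objective is a placeholder; substitute the real one.
scaling_check <- function(sizes = c(100, 500, 1000)) {
  sapply(sizes, function(n) {
    f <- function(x) sum((x - 1)^2)   # stand-in objective
    system.time(
      optim(par = rep(0, n), fn = f, method = "SANN",
            control = list(maxit = 5000))
    )["elapsed"]
  })
}
scaling_check()
```

Plotting the resulting times against the sizes makes the growth rate
obvious well before you commit to a 20,000-variable run.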

  Essentially, this is turning into an optimization problem rather than an R
problem.  Once you know that there exists an optimization approach that can
solve your problem before the sun burns out, you can come back and find out
whether anyone has implemented it in R (or RSiteSearch() for it ...), or
implemented an interface to a lower-level platform.
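If the problem does map onto linear programming, one existing R
implementation is the lpSolve package; the toy problem below (my own
illustration, not from this thread) minimizes c'x subject to Ax >= b with
x >= 0.

```r
## Hypothetical LP illustration with the lpSolve package:
## minimize x1 + 2*x2  subject to  x1 + x2 >= 2,  x1 >= 0.5,  x >= 0.
library(lpSolve)
obj  <- c(1, 2)                          # objective coefficients
A    <- matrix(c(1, 1,
                 1, 0), nrow = 2, byrow = TRUE)
dirs <- c(">=", ">=")
rhs  <- c(2, 0.5)
sol  <- lp(direction = "min", objective.in = obj,
           const.mat = A, const.dir = dirs, const.rhs = rhs)
sol$solution                             # optimal point
sol$objval                               # optimal objective value
```

Solvers like this exploit the problem's structure, which is exactly the
efficiency-over-generality trade mentioned above.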

  Good luck,
    Ben Bolker
-- 
View this message in context: 
http://www.nabble.com/Constrined-dependent-optimization.-tp22772520p22834746.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
