Here is a simple example (finding the parameters of a normal distribution by maximum likelihood) that shows one way to do it:
x <- rnorm(25, 100, 5)

tmpfun <- function(x) {
    steps <- matrix(0:1, nrow = 1)        # dummy first row; discard later
    myfn <- function(param) {
        print(param)                      # show the parameters being tested
        flush.console()
        steps <<- rbind(steps, param)     # record them via <<-
        -sum(dnorm(x, param[1], param[2], log = TRUE))
    }
    out <- optim(c(50, 15), myfn)
    list(out = out, steps = steps)
}

res <- tmpfun(x)
res$out
mean(x)
sd(x) * sqrt(24/25)                       # MLE of sigma, for comparison
plot(res$steps, type = 'l')

The function being optimized is myfn, which prints the current parameter values being tested and also appends them to the steps matrix. (It is cleanest to wrap all of this in a function, as above, so that steps is a local variable rather than a global one.) The result returned by tmpfun has two components: the first (out) is just the output of optim; the other (steps) is a matrix of all the parameter values tested, with an extra row at the top that can be discarded. If you are including the gradient, then just save it in another variable, the same way as steps. For official documentation, you really need to read up on lexical scoping and ?'<<-'.

Hope this helps,

--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.org
801.408.8111

From: Shimrit Abraham [mailto:shimrit.sabra...@gmail.com]
Sent: Tuesday, February 24, 2009 10:29 AM
To: Greg Snow
Cc: r-help@r-project.org; gmain.20.ph...@xoxy.net
Subject: Re: [R] Tracing gradient during optimization

Rob Steele suggested the same thing, but I'm not sure I understand how to implement this exactly. Is there any documentation that you could suggest? This might be something that could be useful in the future.

Thanks,
Shimrit

On Tue, Feb 24, 2009 at 5:09 PM, Greg Snow <greg.s...@imail.org> wrote:

It looks like you found a solution, but if you find yourself in this situation again using optim, then one approach is to modify the function that you are optimizing (or write a wrapper for it) to produce the tracing information for you.

--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.org
801.408.8111

> -----Original Message-----
> From: r-help-boun...@r-project.org
> [mailto:r-help-boun...@r-project.org] On Behalf Of Shimrit Abraham
> Sent: Tuesday, February 24, 2009 7:00 AM
> To: r-help@r-project.org
> Subject: [R] Tracing gradient during optimization
>
> Hi everyone,
>
> I am currently using the function optim() to maximize/minimize
> functions, and I would like to see more output of the optimization
> procedure, in particular the numerical gradient of the parameter
> vector during each iteration.
> The documentation of optim() describes that the trace parameter
> should allow one to trace the progress of the optimization.
> I use the following command:
>
> optim(par = vPar,
>       fn = calcLogLik,
>       method = "BFGS",
>       control = list(trace = TRUE, fnscale = -1, maxit = 2000));
>
> which gives very little information:
>
> initial value 3.056998
> final value 2.978351
> converged
>
> Specifying trace > 1, for instance trace = 20, does not result in
> more information. Is there a way to view more details of the
> progress, perhaps by using another optimizer?
>
> Thanks,
>
> Shimrit Abraham
>
> ______________________________________________
> R-help@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
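Following up on the remark about saving the gradient in another variable: one way the same wrapper idea might look when an analytic gradient is passed to optim() via its gr argument is sketched below. This is only an illustration, not code from the thread — the names (tracefun, negll, gradfn, fsteps, gsteps) and the starting values are made up, and the analytic gradient is that of the negative log-likelihood of a normal with parameters (mu, sigma).

```r
set.seed(1)
x <- rnorm(25, 100, 5)

tracefun <- function(x) {
    fsteps <- NULL   # parameter values at each function evaluation
    gsteps <- NULL   # gradient values at each gradient evaluation

    negll <- function(param) {
        fsteps <<- rbind(fsteps, param)   # record via lexical scoping
        -sum(dnorm(x, param[1], param[2], log = TRUE))
    }

    # analytic gradient of the negative log-likelihood wrt (mu, sigma)
    gradfn <- function(param) {
        mu <- param[1]; s <- param[2]
        g <- c(-sum(x - mu) / s^2,
               length(x) / s - sum((x - mu)^2) / s^3)
        gsteps <<- rbind(gsteps, g)       # record each gradient evaluation
        g
    }

    out <- optim(c(90, 10), negll, gr = gradfn, method = "BFGS")
    list(out = out, fsteps = fsteps, gsteps = gsteps)
}

res <- tracefun(x)
res$out$par        # should be close to mean(x) and sd(x)*sqrt(24/25)
head(res$gsteps)   # gradient at each BFGS gradient evaluation
```

As with steps above, wrapping everything in a function keeps fsteps and gsteps local, and the recorded matrices can then be printed or plotted after optim returns.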