In the sort of problem discussed below, the suggestion to supply analytic
gradients (which I believe is what is meant by "minus score vector") is very
important. Analytic gradients are almost always worthwhile when optimizing
smooth functions, both for efficiency of computation and for quality of
results.
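For the logistic regression case quoted below, the minus score vector has a simple closed form: -t(x) %*% (y - p), where p are the fitted probabilities. A minimal sketch (names `fn`, `gr`, `x`, `y`, `betas` are illustrative), including a finite-difference check, which is always prudent before trusting an analytic gradient:

```r
# Negative log-likelihood for logistic regression
fn <- function(betas, y, x) {
  -sum(dbinom(y, 1, plogis(c(x %*% betas)), log = TRUE))
}

# Analytic gradient: the minus score vector, -t(x) %*% (y - p)
gr <- function(betas, y, x) {
  p <- plogis(c(x %*% betas))
  -c(crossprod(x, y - p))
}

# Sanity check against central finite differences on small simulated data
set.seed(1)
x <- cbind(1, runif(20, -3, 3))
y <- rbinom(20, 1, 0.5)
b <- c(0.1, -0.2)
eps <- 1e-6
num <- sapply(seq_along(b), function(j) {
  e <- numeric(length(b)); e[j] <- eps
  (fn(b + e, y, x) - fn(b - e, y, x)) / (2 * eps)
})
max(abs(num - gr(b, y, x)))  # should be near zero
```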

Users may also want to try updated codes (Rvmmin is a BFGS algorithm with box
constraints; ucminf is an unconstrained BFGS) or different approaches entirely,
depending on the function. Package optimx lets users discover the relative
properties of different optimizers on their own class of problems.
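A sketch of such a comparison, assuming the optimx package (and, for the extra methods, the Rvmmin and ucminf packages) is installed from CRAN; the Rosenbrock function is used here purely as a standard smooth test problem:

```r
library(optimx)  # assumed installed; provides optimx() for side-by-side runs

# A smooth test function (Rosenbrock) and its analytic gradient
fr <- function(p) 100 * (p[2] - p[1]^2)^2 + (1 - p[1])^2
grr <- function(p) c(-400 * p[1] * (p[2] - p[1]^2) - 2 * (1 - p[1]),
                     200 * (p[2] - p[1]^2))

# One call, several optimizers; extra methods need their packages installed
res <- optimx(c(-1.2, 1), fr, grr, method = c("BFGS", "Rvmmin", "ucminf"))
print(res)  # one row per method: parameters, objective value, counts, convergence
```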

John Nash



> From: Dimitris Rizopoulos <d.rizopou...@erasmusmc.nl>
> To: justin bem <justin_...@yahoo.fr>
> Cc: R Maillist <r-h...@stat.math.ethz.ch>
> Subject: Re: [R] Fitting GLM with BFGS algorithm
> Message-ID: <4cc6c0db.9070...@erasmusmc.nl>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
> 
> for instance, for logistic regression you can do something like this:
> 
> # simulate some data
> x <- cbind(1, runif(100, -3, 3), rbinom(100, 1, 0.5))
> y <- rbinom(100, 1, plogis(c( x%*% c(-2, 1, 0.3))))
> 
> # BFGS from optim()
> fn <- function (betas, y, x) {
>    -sum(dbinom(y, 1, plogis(c(x %*% betas)), log = TRUE))
> }
> optim(rep(0, ncol(x)), fn, x = x, y = y, method = "BFGS")
> 
> # IWLS from glm()
> glm(y ~ x[, -1], family = "binomial")
> 
> You can also improve it by providing the minus score vector as a third 
> argument to optim().
> 
> 
> I hope it helps.
> 
> Best,
> Dimitris
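Concretely, the minus score vector for the logistic model in the quoted example is -t(x) %*% (y - p); passing it as optim()'s gr argument typically cuts function evaluations and sharpens the solution. A sketch continuing that example (the seed is added here so the simulation is reproducible):

```r
# Simulate data as in the quoted example
set.seed(42)
x <- cbind(1, runif(100, -3, 3), rbinom(100, 1, 0.5))
y <- rbinom(100, 1, plogis(c(x %*% c(-2, 1, 0.3))))

# Negative log-likelihood
fn <- function(betas, y, x) {
  -sum(dbinom(y, 1, plogis(c(x %*% betas)), log = TRUE))
}

# Minus score vector: analytic gradient of fn
gr <- function(betas, y, x) {
  p <- plogis(c(x %*% betas))
  -c(crossprod(x, y - p))
}

# BFGS with the analytic gradient supplied via gr
fit <- optim(rep(0, ncol(x)), fn, gr, x = x, y = y, method = "BFGS")
fit$par  # should closely match coef(glm(y ~ x[, -1], family = binomial))
```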

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
