Hi, I am trying to do nonlinear minimization with the nlm() function, but for large amounts of data it runs out of memory.
The code I am using:

    f <- function(p, n11, E) {
      ## negative log-likelihood of a two-component
      ## negative binomial mixture with weight p[5]
      sum(-log(p[5]       * dnbinom(n11, size = p[1], prob = p[2] / (p[2] + E)) +
               (1 - p[5]) * dnbinom(n11, size = p[3], prob = p[4] / (p[4] + E))))
    }

    p_out <- nlm(f, p = c(alpha1 = 0.2, beta1 = 0.06,
                          alpha2 = 1.4, beta2 = 1.8, w = 0.1),
                 n11 = n11_c, E = E_c)

When the n11_c or E_c vector is too large, R runs out of memory. Could someone suggest a solution?
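One workaround I have been considering: each call to f allocates several temporary vectors as long as n11, and nlm() calls f many times while estimating numerical gradients, so the peak allocation can be cut by evaluating the sum in chunks. The sketch below is only an idea, not tested on my full data; the chunk size of 1e5 is an arbitrary guess to be tuned, and it also swaps in dnbinom(..., log = TRUE) with a log-sum-exp step, which should additionally protect against underflow:

    f_chunked <- function(p, n11, E, chunk = 1e5) {
      nll <- 0
      for (i in seq(1, length(n11), by = chunk)) {
        j <- i:min(i + chunk - 1, length(n11))
        n <- n11[j]
        e <- E[j]
        ## log densities of the two mixture components
        l1 <- log(p[5])     + dnbinom(n, size = p[1], prob = p[2] / (p[2] + e), log = TRUE)
        l2 <- log(1 - p[5]) + dnbinom(n, size = p[3], prob = p[4] / (p[4] + e), log = TRUE)
        ## log-sum-exp: log(w*d1 + (1-w)*d2) computed stably
        m <- pmax(l1, l2)
        nll <- nll - sum(m + log(exp(l1 - m) + exp(l2 - m)))
      }
      nll
    }

    p_out <- nlm(f_chunked, p = c(alpha1 = 0.2, beta1 = 0.06,
                                  alpha2 = 1.4, beta2 = 1.8, w = 0.1),
                 n11 = n11_c, E = E_c)

If n11_c contains many duplicated (n11, E) pairs, tabulating them and weighting the log-likelihood terms by their counts would shrink the vectors further, but I have not tried that yet.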