Dear all,

In my attempt to run the modelling command below in R 2.7.0 under Windows XP (4 GB RAM, with the /3GB switch set), I receive the following error:

  Error: cannot allocate vector of size 1.5 Gb

I have searched a bit and have tried adding --max-mem-size=3071M to the command line (when set to 3G I get an error that 3072M is too much). I also ran:

  > memory.size()
  [1] 11.26125
  > memory.size(max=T)
  [1] 13.4375

Modelling script:

  model.females <- quote(gamlss(WAZ11 ~ cs(sqrtage, df=12) + country,
                                sigma.formula = ~cs(sqrtage, df=3) + country,
                                nu.formula = ~cs(sqrtage, df=1),
                                tau.formula = ~cs(sqrtage, df=1),
                                data = females, family = BCPE, control = con))
  fit.females <- eval(model.females)

The data frame females (1,654 KB), which is being modelled with the GAMLSS package, contains 158,533 observations.

I have also installed various memory-optimization programs under XP, but to no avail. I believe I may need to set the Vcells and Ncells limits, but I am not sure which ones, nor to what values. Any other help in maximizing my RAM usage in R would be greatly appreciated. I am quite a novice, so please excuse any obvious mistakes or omissions.

Thank you in advance for your help.

Dr. Daniel Hayes
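P.S. For completeness, this is roughly how I have been inspecting the limits from within R. It is only a rough sketch: memory.limit() and memory.size() are Windows-only calls, the 3071 figure is simply the value I have been trying with --max-mem-size, and the last line just gauges how large the females data frame is in memory.

  memory.limit()                 # current memory cap in MB (Windows-only)
  memory.limit(size = 3071)      # attempt to raise the cap towards 3 GB
  memory.size()                  # MB currently in use by R
  memory.size(max = TRUE)        # maximum MB R has obtained from the OS so far
  gc()                           # Ncells/Vcells rows show cons-cell and vector-heap usage
  as.numeric(object.size(females)) / 1024^2   # size of the females data frame in MB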
In my attempt to run the below modelling command in R 2.7.0 under windows XP (4GB RAM with /3GB switch set) I receive the following error: Error: cannot allocate vector of size 1.5 Gb I have searched a bit and have tried adding: --max-mem-size=3071M to the command line (when set to 3G I get the error that 3072M is too much) I also run: > memory.size() [1] 11.26125 > memory.size(max=T) [1] 13.4375 Modelling script: model.females <- quote(gamlss(WAZ11~cs(sqrtage,df=12)+country, sigma.formula=~cs(sqrtage,df=3)+country, nu.formula=~cs(sqrtage,df=1), tau.formula=~cs(sqrtage,df=1), data=females, family=BCPE, control=con)) fit.females <- eval(model.females) the females (1,654KB) that is being modelled by the GAMLSS package contains 158,533 observations I have further installed various memory optimization programs under XP but to no avail. I believe that I perhaps need to set the Vcells and Ncells but am not sure which nor to what limits. Any other help in maximizing my RAM usage in R would be great I am quite a novice so please excuse any obvious mistakes or omissions. Thank you in advance for your help Dr. Daniel Hayes [[alternative HTML version deleted]] ______________________________________________ R-help@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-help PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.