It would be helpful to produce a script that reproduces the error on your system, and to include details on the size of your data set and what you are doing with it. It is unclear what function is actually causing the error. Really, in order to do something about it, you need to show how to actually obtain the error.

To my knowledge nothing _major_ has happened with memory consumption, but of course R could use slightly more memory for specific purposes. Chances are that this is not really memory related but more related to the functions you are using - perhaps a bug or perhaps a user error.

Kasper

On Nov 6, 2006, at 10:20 AM, Derek Stephen Elmerick wrote:

> thanks for the friendly reply. i think my description was fairly clear: i
> import a large dataset and run a model. using the same dataset, the
> process worked previously and it doesn't work now. if the new version of R
> requires more memory and this compromises some basic data analyses, i would
> label this as a bug. if this memory issue was mentioned in the
> documentation, then i apologize. this email was clearly not well received,
> so if there is a more appropriate place to post these sorts of questions,
> that would be helpful.
>
> -derek
>
> On 06 Nov 2006 18:20:33 +0100, Peter Dalgaard <[EMAIL PROTECTED]> wrote:
>>
>> [EMAIL PROTECTED] writes:
>>
>>> Full_Name: Derek Elmerick
>>> Version: 2.4.0
>>> OS: Windows XP
>>> Submission from: (NULL) (38.117.162.243)
>>>
>>> hello -
>>>
>>> i have some code that i run regularly using R version 2.3.x. the final
>>> step of the code is to build a multinomial logit model. the dataset is
>>> large; however, i have not had issues in the past. i just installed the
>>> 2.4.0 version of R and now have memory allocation issues. to verify, i
>>> ran the code again against the 2.3 version and no problems. since i have
>>> set the memory limit to the max size, i have no alternative but to
>>> downgrade to the 2.3 version. thoughts?
>>
>> And what do you expect the maintainers to do about it? (I.e., why are
>> you filing a bug report?)
>>
>> You give absolutely no handle on what the cause of the problem might
>> be, or even how to reproduce it. It may be a bug, or maybe just R
>> requiring more memory to run than previously.
>>
>> --
>>    O__  ---- Peter Dalgaard             Øster Farimagsgade 5, Entr.B
>>   c/ /'_ --- Dept. of Biostatistics     PO Box 2099, 1014 Cph. K
>>  (*) \(*) -- University of Copenhagen   Denmark   Ph:  (+45) 35327918
>> ~~~~~~~~~~ - ([EMAIL PROTECTED])                  FAX: (+45) 35327907
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
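As a postscript: the kind of self-contained reproduction script Kasper asks for might look roughly like the sketch below. It is a hypothetical example, not Derek's actual code - the data sizes, variable names, and formula are placeholders, and it assumes the model in question is fit with `multinom()` from the recommended `nnet` package. The `object.size()`, `gc()`, and `sessionInfo()` calls report the data set size, memory use, and R version, which is the information a maintainer would need alongside the error message.

```r
## Hypothetical minimal reproduction script (placeholder data, not the
## original data set). Scale n up until the problem appears.
set.seed(1)
n <- 1e4
d <- data.frame(
  y  = factor(sample(letters[1:3], n, replace = TRUE)),  # 3-level outcome
  x1 = rnorm(n),
  x2 = rnorm(n)
)
print(object.size(d), units = "Mb")   # how big is the data set in memory?

## The step that reportedly fails under 2.4.0: a multinomial logit fit.
fit <- nnet::multinom(y ~ x1 + x2, data = d, trace = FALSE)

print(gc())     # memory actually used so far
sessionInfo()   # R version, platform, loaded packages
```

Running such a script unchanged under both 2.3.x and 2.4.0, and posting both outputs, would show whether memory use genuinely changed between versions or whether something else is at fault.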