Hello,

For my present project I need to use the data stored in a ca. 100 MB
Stata dataset.

When I import the data into R using:

library("foreign")
x<-read.dta("mydata.dta")

I find that R needs a startling 665 MB of memory!

(In Stata I can simply allocate, say, 128 MB of memory and go ahead.)

Is there any way around this, or should I forget R for the analysis of
datasets of this magnitude?

Thanks for your help with this, Edwin.

______________________________________________
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html