Hi, I need to analyze data with 3.5 million observations and about
60 variables, and I was planning to use R for this, but I can't even
seem to read in the data.  It just freezes and ties up the whole
system, and this is on a dual-processor Linux box purchased about six
months ago that was pretty much top of the line.  I've tried expanding
R's memory limits, but it doesn't help.  I'll be hugely disappointed
if I can't use R, because I need to build tailor-made models
(multilevel and other complexities).  My fallback is the S-PLUS big
data package, but I'd rather avoid that if anyone can provide a
solution.
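For concreteness, the read-in I've been attempting looks roughly like
the sketch below (the file name and separator are placeholders, not
the actual script):

    ## placeholder file name and delimiter; the real file has
    ## roughly 3.5 million rows and about 60 columns
    dat <- read.table("mydata.txt", header = TRUE, sep = "\t")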

Thanks!!!!

Jennifer Hill

