JENNIFER HILL <jh1030 <at> columbia.edu> writes:

> 
> 
> Hi, I need to analyze data that has 3.5 million observations and
> about 60 variables and I was planning on using R to do this but
> I can't even seem to read in the data.  It just freezes and ties
> up the whole system -- and this is on a Linux box purchased about
> 6 months ago on a dual-processor PC that was pretty much the top
> of the line.  I've tried expanding R's memory limits but it
> doesn't help.  I'll be hugely disappointed if I can't use R b/c
> I need to build tailor-made models (multilevel and other
> complexities).   My fall-back is the SPlus big data package but
> I'd rather avoid it if anyone can provide a solution....
> 
> Thanks!!!!
> 
> Jennifer Hill
> 
Dear Jennifer, you may want to look at R News, the R newsletter. A few years
ago it carried an article on using a DBMS with R, such as MySQL or Oracle. This
is a frequently asked question, and there are also posts from the past few
years on the list archives that may be helpful. I have successfully read a
large database into MySQL and accessed it from R---it was larger than yours.
I hope that helps. Anupam Tyagi.
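As a rough illustration of the DBMS approach, here is a sketch using the DBI
and RMySQL packages to fetch a large table in chunks rather than all at once.
The database, table, and column names below are hypothetical, as are the
connection credentials; adjust them for your own setup.

```r
## Sketch: pull a large MySQL table into R in manageable chunks,
## so all 3.5 million rows never have to sit in memory at once.
## Assumes the DBI and RMySQL packages; names here are hypothetical.
library(DBI)
library(RMySQL)

con <- dbConnect(MySQL(), dbname = "mydb",
                 user = "me", password = "secret")

## Send the query once, then fetch a fixed number of rows at a time.
res <- dbSendQuery(con, "SELECT * FROM obs")
while (!dbHasCompleted(res)) {
  chunk <- fetch(res, n = 100000)   # 100k rows per iteration
  ## ... process or aggregate each chunk here ...
}
dbClearResult(res)
dbDisconnect(con)
```

Aggregation can often be pushed into the SQL itself (GROUP BY, WHERE), so
that only summaries, not raw rows, ever cross into R.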

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
