Alan Cohen wrote:
Hello,

I have several very large data sets (1-7 million observations, sometimes 
hundreds of variables) that I'm trying to work with in R, and memory seems to 
be a big issue.  I'm currently using a 2 GB Windows setup, but might have the 
option to run R on a server remotely.  Windows R seems to be basically limited 
to 2 GB of memory, if I understand correctly; is it possible to go much beyond 
that with server-based R?  In other words, am I limited by R or by my hardware, 
and how much might R be able to handle if I get the necessary hardware?
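For scale, a rough back-of-envelope estimate (assuming all-numeric columns
stored as 8-byte doubles; the counts below are only illustrative) would be:

  n_obs  <- 7e6    # upper end of my range of observations
  n_vars <- 200    # "hundreds of variables" -- illustrative figure
  n_obs * n_vars * 8 / 2^30    # roughly 10.4 GB, before R makes any copies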

Hi,

R under Windows can handle up to 2 GB of memory, and under certain conditions up to 3 GB. Note that no 64-bit build of R for Windows is available yet.
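A minimal sketch for checking this from within a Windows session
(memory.limit() and memory.size() are Windows-only; the size requested below
is only an example):

  memory.limit()              # current limit in MB
  memory.limit(size = 3000)   # request more; only honoured if the OS allows it
  memory.size()               # MB currently used by R objects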

Anyway, under 64-bit Linux there are almost no limits other than hardware restrictions.
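In the meantime, a common workaround on any platform is to read in only the
columns you actually need; a rough sketch (file name and column layout here
are made up):

  ## "NULL" entries in colClasses drop those columns at read time
  cc  <- c("integer", "numeric", rep("NULL", 98))   # keep 2 of 100 columns
  dat <- read.table("bigfile.txt", header = TRUE, colClasses = cc)
  object.size(dat)    # size of what actually ended up in memory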

Best wishes,
Uwe


Also, any possibility of using web-based R for this kind of thing?



Cheers,
Alan Cohen

Alan Cohen
Post-doctoral Fellow
Centre for Global Health Research
70 Richmond St. East, Suite 202A
Toronto, ON M5C 1N8
Canada
(416) 854-3121 (cell)
(416) 864-6060 ext. 3156 (office)

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
