Hi there,

I'm very new to R and am only in the beginning stages of investigating
it for possible use. A document by John Maindonald on the R Project
website, entitled "Using R for Data Analysis and Graphics: Introduction,
Code and Commentary", contains the following paragraph: "The R system may
struggle to handle very large data sets. Depending on available computer
memory, the processing of a data set containing one hundred thousand
observations and perhaps twenty variables may press the limits of what R
can easily handle". That document was written in 2004.

My questions are:

Is this still the case? If so, has anyone come up with creative
solutions to mitigate these limitations? If you work with large data
sets in R, what have your experiences been?

From what I've seen so far, R seems to have enormous potential and
capability. I routinely work with data sets of several hundred
thousand to several million observations. It would be unfortunate if
that potential went unrealized because of (effective) limits on data
set size.
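For context, here is the back-of-the-envelope arithmetic I've been using to
guess whether a data set of that size fits in memory (a rough sketch only;
the column count and all-numeric types are my assumptions, and object.size()
reports just the in-memory footprint, not the working space an analysis needs):

```r
# Rough memory estimate: 1 million rows x 20 numeric (double) columns.
# Each double is 8 bytes, so the raw data alone is about 160 MB.
n_rows <- 1e6
n_cols <- 20
approx_bytes <- n_rows * n_cols * 8
approx_bytes / 1024^2   # roughly 152.6 MB

# Confirm against an actual data frame (assumes you have memory to spare,
# since building it transiently needs about twice that):
df <- as.data.frame(matrix(rnorm(n_rows * n_cols), ncol = n_cols))
print(object.size(df), units = "MB")
```

If that arithmetic is right, even my larger data sets are only a few
gigabytes of raw numbers, so the question is really how much headroom R
needs beyond the data itself.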

Please tell me it ain't so.

Thanks for any help or suggestions.

Carl


______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html