Hi All,
I am trying to assemble a system that will allow me to work with large
datasets (45-50 million rows, 300-400 columns), possibly amounting to
10 GB or more in size.

I am aware that 64-bit builds of R on Linux boxes are suitable for
such an exercise, but I am looking for configurations that R users
may have used in building a high-end R system.
Because many SAS users have apprehensions about R's data-size
limitations, I want to demonstrate that R remains usable even with
very large datasets such as the one described above.
I would be glad to hear from users who have desktops or servers on
which they use R to crunch massive datasets; please share your
configurations and system-specific information.


Any suggestions for extending R's capabilities in the face of
gigabyte-class datasets would also be appreciated.
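
To make the question concrete, here is a minimal sketch of the kind of
approach I have in mind, using the 'bigmemory' package; the file name
"bigdata.csv" is just a placeholder for the real dataset. Comments on
whether this scales to the sizes above would be very welcome.

library(bigmemory)

## Create a file-backed big.matrix so the full dataset need not fit in RAM.
## "bigdata.csv" is a hypothetical stand-in for the real 10 GB file.
x <- read.big.matrix("bigdata.csv", header = TRUE, type = "double",
                     backingfile = "bigdata.bin",
                     descriptorfile = "bigdata.desc")

dim(x)        # should report roughly 45-50 million rows, 300-400 columns
mean(x[, 1])  # extracting a single column pulls only that column into memory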

Thanks
Harsh Singhal
Decision Systems,
Mu Sigma Inc.
Chicago, IL
