I may be mistaken, but I believe R does all its work in memory. If that is so, you really have only two options:
1. Get a lot more memory.
2. Figure out a way to do the desired operation on parts of the data at a time.

-Roy M.

On Feb 27, 2008, at 9:03 PM, Jorge Iván Vélez wrote:

> Dear R-list,
>
> Does somebody know how I can read a HUGE data set using R? It is a
> HapMap data set (txt format) which is around 4 GB. After reading it,
> I need to delete some specific rows and columns. I'm running R 2.6.2
> patched on XP SP2 with a 2.4 GHz Core 2 Duo processor and 4 GB RAM.
> Any suggestion would be appreciated.
>
> Thanks in advance,
>
> Jorge

**********************
"The contents of this message do not reflect any position of the U.S. Government or NOAA."
**********************
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS Environmental Research Division
Southwest Fisheries Science Center
1352 Lighthouse Avenue
Pacific Grove, CA 93950-2097

e-mail: [EMAIL PROTECTED] (Note new e-mail address)
voice: (831)-648-9029
fax: (831)-648-8440
www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
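A minimal sketch of option 2, in case it helps: read the file in fixed-size chunks with a connection, filter each chunk, and write the survivors to a new file, so the full 4 GB never has to sit in memory at once. The file names, chunk size, column names, and the row/column filters below are hypothetical stand-ins for the real hapmap criteria; a tiny demo file takes the place of the actual data set.

```r
## Sketch only: chunked filtering of a large delimited file.
## The column names ("chr", "pos", "snp") and the filter (drop rows where
## chr == "chrX", keep only two columns) are made up for illustration.

# Stand-in for the real 4 GB data set: a small tab-delimited demo file.
infile <- tempfile(fileext = ".txt")
writeLines(c("chr\tpos\tsnp",
             paste("chr1", 1:500, "A", sep = "\t"),
             paste("chrX", 1:500, "G", sep = "\t")),
           infile)

outfile <- tempfile(fileext = ".txt")
con_in  <- file(infile,  open = "r")
con_out <- file(outfile, open = "w")

header <- readLines(con_in, n = 1)   # carry the header through unchanged
writeLines(header, con_out)

chunk_size <- 100                    # rows per pass; tune to available RAM
repeat {
  lines <- readLines(con_in, n = chunk_size)
  if (length(lines) == 0) break      # end of file reached
  tc <- textConnection(lines)
  chunk <- read.table(tc, sep = "\t", stringsAsFactors = FALSE,
                      col.names = c("chr", "pos", "snp"))
  close(tc)
  # Drop unwanted rows and columns, then append the rest to the output.
  keep <- chunk[chunk$chr != "chrX", c("chr", "pos")]
  write.table(keep, con_out, sep = "\t", row.names = FALSE,
              col.names = FALSE, quote = FALSE)
}
close(con_in)
close(con_out)
```

Only one chunk is ever held in memory, so the chunk size, not the file size, bounds memory use; the trade-off is one extra pass over the disk to produce the trimmed file, which can then be read normally with read.table.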