Thank you all! I got very useful advice and I saw some nice solutions that even made me drool! ;-)
Cheers!!
Albert-Jan

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In the face of ambiguity, refuse the temptation to guess.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

--- On Wed, 12/16/09, Søren Højsgaard <[email protected]> wrote:

From: Søren Højsgaard <[email protected]>
Subject: SV: [R] R & very large files
To: "Albert-Jan Roskam" <[email protected]>, "[email protected]" <[email protected]>
Date: Wednesday, December 16, 2009, 12:13 PM

The sqldf package may be of help to you.
Regards
Søren

-----Original message-----
From: [email protected] [mailto:[email protected]] On behalf of Albert-Jan Roskam
Sent: 16 December 2009 11:59
To: [email protected]
Subject: [R] R & very large files

Hi,

I very recently started using R (as in: last week), and I was wondering whether anyone could point me to websites with sample code for dealing with large datasets (length- and/or breadthwise). I understand that R was never designed to work with datasets larger than, say, a few hundred MB. One approach, as I have also read, is to let R work in conjunction with SQL; that is an interesting approach I'd like to know more about. But I was also hoping there are pure R solutions for working with very large tables (was 'scan' designed for that?). In any case, a standard approach would be desirable. Thanks in advance.

Cheers!!
Albert-Jan

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In the face of ambiguity, refuse the temptation to guess.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
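For readers of the archive, a minimal sketch of the two approaches discussed in this thread: the sqldf route suggested by Søren (read.csv.sql() stages the file in a temporary SQLite database and returns only the rows the query selects, so the whole file never has to fit in memory), and a pure-R route that processes the file in chunks through a connection. The file name "mydata.csv" and the column "age" are hypothetical placeholders, not from the original posts.

```r
# Approach 1: sqldf. read.csv.sql() refers to the input file as the
# SQL table "file" by default; only the matching rows come back into R.
library(sqldf)

big_subset <- read.csv.sql("mydata.csv",
                           sql = "select * from file where age > 65")

# Approach 2: pure R. Open a connection and pull the file through
# read.table() in fixed-size chunks, aggregating or filtering as you go.
con <- file("mydata.csv", open = "r")
header <- readLines(con, n = 1)  # consume the header line once
repeat {
  chunk <- tryCatch(
    read.table(con, nrows = 10000, sep = ","),
    error = function(e) NULL)    # read.table() errors once the file is exhausted
  if (is.null(chunk)) break
  # ... process each 10,000-row chunk here ...
}
close(con)
```

scan() with a connection and an n argument supports the same chunked pattern; packages such as ff and bigmemory, also current at the time of this thread, offer on-disk alternatives as well.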
______________________________________________ [email protected] mailing list https://stat.ethz.ch/mailman/listinfo/r-help PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.

