You might consider processing this in pieces. If I assume the data are
numeric, then after conversion the object will take up about 256MB of RAM,
which is half of your physical memory. This means you will probably not be
able to do much processing of the data, since R will likely have to make
copies of parts of the object at various times. It also assumes that the
full 512MB is really available to your program; the operating system and
other applications are using some of it as well.
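
For reference, that 256MB figure is just the dimensions of the object times
the size of a double; the only assumption in this back-of-the-envelope
calculation is that every column is stored as numeric:

400000 * 80 * 8        # rows x columns x 8 bytes per double
## [1] 256000000       # about 256 million bytes, i.e. roughly 244 MiB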
Do you have to have all the data in memory at one time? If so, consider
getting a 64-bit version of R and at least 2GB of RAM to be on the safe
side. You could also put the data in a database and then read in only the
columns that are needed.
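
As a rough, untested sketch of how those two suggestions could be combined,
you could stream the file into an SQLite table in chunks and then query back
only the columns you need. The file name, table name, chunk size, and column
names below are all made up for illustration, and it assumes the RSQLite
package and all-numeric columns:

library(RSQLite)                                   # assumes RSQLite is installed

db  <- dbConnect(SQLite(), dbname = "bigdata.db")  # database file name is made up
con <- file("myfile.csv", open = "r")              # so is the CSV name
nms <- gsub('"', "", strsplit(readLines(con, n = 1), ",")[[1]])  # header names

first <- TRUE
repeat {
    ## read the next 50,000 rows; read.csv() errors once the file is exhausted
    chunk <- tryCatch(
        read.csv(con, header = FALSE, nrows = 50000, col.names = nms,
                 colClasses = "numeric"),          # assumes all-numeric columns
        error = function(e) NULL)
    if (is.null(chunk)) break
    if (first) {
        dbWriteTable(db, "mydata", chunk)          # create the table from chunk 1
        first <- FALSE
    } else {
        dbWriteTable(db, "mydata", chunk, append = TRUE)  # append later chunks
    }
    if (nrow(chunk) < 50000) break                 # last, short chunk
}
close(con)

## later, pull back only the columns a given analysis needs
## (replace col1, col17 with real names from your header)
x <- dbGetQuery(db, "SELECT col1, col17 FROM mydata")
dbDisconnect(db)

The point is that only one chunk of the raw data is ever held in R at a
time, and later analyses only pay for the columns they actually ask for.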
On 8/11/06, T Mu [EMAIL PROTECTED] wrote:
I was trying to read a large .csv file (80 columns, 400,000 rows, about
200MB in size). I used scan(), with R 2.3.1 on Windows XP. My computer is an
AMD 2000+ with 512MB of RAM.
It sometimes freezes my PC, and sometimes just shuts down R quietly.
Is there a way (option, function) to better handle large files?
Seemingly SAS can deal with it with no problem, but I just persuaded my
professor to switch to R, so it is quite disappointing.
Please help, thank you.
--
Jim Holtman
Cincinnati, OH
+1 513 646 9390
What is the problem you are trying to solve?
__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.