Bad news!

The readLines approach works fine up to a certain limit. Once a few files
have been written, the R program crashes.

I used the following code:
*************************
iFile <- "Test.txt"
con <- file(iFile, "r")          # open a read connection to the big file

N <- 1250000                     # lines to read per chunk
iLoop <- 1

## Read N lines at a time and write each chunk to its own file
while (length(Lines <- readLines(con, n = N)) > 0 && iLoop < 41) {
  oFile <- paste("Split_", iLoop, ".txt", sep = "")
  write.table(Lines, oFile, sep = "\t", quote = FALSE,
              col.names = FALSE, row.names = FALSE)
  iLoop <- iLoop + 1
}
close(con)
********************

With N = 1.25 million as above, it wrote 28 files of about 57 MB each, a
total of about 1.6 GB, and then crashed.
I tried other values of N, and it crashes at about the same point in terms
of total output size, i.e., about 1.6 GB.
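In case it helps narrow things down: since the chunks are plain lines of text rather than a data frame, writeLines() may be a lighter-weight alternative to write.table(), and explicitly dropping each chunk with rm()/gc() before the next read keeps less in memory at once. This is only a sketch under those assumptions; the split_file() wrapper and its arguments are mine, not from the original code.

```r
## Sketch: split a text file into chunks of N lines using writeLines()
## instead of write.table(), freeing each chunk before reading the next.
split_file <- function(iFile, N = 1250000, max_parts = 40) {
  con <- file(iFile, "r")
  on.exit(close(con))              # connection is closed even on error
  iLoop <- 1
  while (iLoop <= max_parts &&
         length(Lines <- readLines(con, n = N)) > 0) {
    oFile <- paste("Split_", iLoop, ".txt", sep = "")
    writeLines(Lines, oFile)       # plain text out: no quoting, no row names
    rm(Lines); gc()                # release the chunk before the next read
    iLoop <- iLoop + 1
  }
  iLoop - 1                        # number of files written
}
```

Whether this avoids the crash depends on where the 1.6 GB limit is coming from, but it at least rules out write.table() overhead as a factor.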

Is this due to some limitation of Windows 7, such as the file pointer not
working beyond this size?

Your insight would be very helpful.

Thank you.
HC

--
View this message in context: 
http://r.789695.n4.nabble.com/sqldf-for-Very-Large-Tab-Delimited-Files-tp4350555p4355679.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
