On 26.09.2010 14:38, statquant2 wrote:

Hello everyone,
I currently run R code that has to read 100 or more large csv files (>= 100
MB each), and usually write csv too.
My colleagues and I like R very much but are a little astonished by how
slow those functions are. We have looked at every argument of those
functions, and while specifying some parameters helps a bit, this is still
too slow.
I am sure a lot of people have the same problem, so I thought one of you
would know a trick or a package that would speed this up a lot.

(we work on Red Hat Linux with R 2.10.0, but I guess that is irrelevant to
this problem)

Thanks for reading this.
Have a nice weekend
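[A minimal sketch of the kind of argument tuning mentioned above; the file name and column types are hypothetical, chosen only for illustration:]

```r
# Hypothetical file "data.csv" with three columns of known types.
# Declaring colClasses lets read.csv skip type-guessing, and nrows
# (an upper bound is fine) lets it pre-allocate; both typically help.
dat <- read.csv("data.csv",
                colClasses  = c("integer", "numeric", "character"),
                nrows       = 1e6,   # rough upper bound on row count
                comment.char = "")   # disable comment scanning
```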


Most of us read the csv file once and immediately write an .RData file (see ?save). After the data have been imported once with read.csv and friends, they can be read back much more quickly from that file.
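[The workflow described above, sketched with a hypothetical file name:]

```r
# One-off import: parse the csv once, then cache it in R's binary format.
dat <- read.csv("data.csv")
save(dat, file = "data.RData")

# Later sessions: load() restores the 'dat' object far faster than
# re-parsing the csv with read.csv.
load("data.RData")
```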

Uwe Ligges

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
