Hi, after testing
R) system.time(read.csv("myfile.csv"))
   user  system elapsed
  1.126   0.038   1.177
R) system.time(read.csv.sql("myfile.csv"))
   user  system elapsed
  1.405   0.025   1.439
Warning messages:
1: closing unused connection 4 ()
2: closing unused connection 3 ()
It seems that
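For reference, the comparison above can be reproduced roughly as follows (a sketch: read.csv.sql comes from the sqldf package mentioned later in the thread, and "myfile.csv" stands in for the actual file):

```r
library(sqldf)  # provides read.csv.sql

system.time(read.csv("myfile.csv"))      # base R parser
system.time(read.csv.sql("myfile.csv"))  # sqldf: csv -> temporary SQLite db -> data.frame
```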
To speed things up, you certainly want to give R more clues about your
data files by being more explicit with many of the arguments (cf.
help(read.table)); in particular, specifying the 'colClasses' argument
makes a big difference.
/Henrik
On Tue, Sep 28, 2010 at 10:24 AM, statquant2 statqu...@gmail.com wrote:
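A minimal sketch of the colClasses suggestion above (the column names and types here are hypothetical; the point is that declaring types up front spares read.csv a type-guessing pass over the whole file):

```r
## Hypothetical 3-column file: one character column, two numeric columns.
x <- read.csv("myfile.csv",
              colClasses = c("character", "numeric", "numeric"),
              nrows = 1e6,        # generous upper bound on rows helps allocation
              comment.char = "")  # disable comment scanning
```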
Hello all,
the test I provided was just to pinpoint that, for loading a big csv
file once, read.csv was quicker than read.csv.sql... I have already
optimized my calls to read.csv for my particular problem, but if a simple
call to read.csv was quicker than read.csv.sql I doubt that specifying
On Tue, Sep 28, 2010 at 5:02 PM, statquant2 statqu...@gmail.com wrote:
> Hello all,
> the test I provided was just to pinpoint that for loading once a big csv

A file that can be read in under 2 seconds is not big.

> file with read.csv was quicker than read.csv.sql... I have already
> optimized my
thank you very much for this sql package, the thing is that those tables I
read are loaded into memory once and for all, and then we work with the
data.frames...
Do you think then that this is going to be quicker (as I would have thought
that building the SQL DB from the flat file would already
Hello everyone,
I currently run R code that has to read 100 or more large csv files (= 100
Mo), and usually writes csv too.
My colleagues and I like R very much but are a little bit astonished by how
slow those functions are. We have looked at every argument of those
functions and if specifying
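Since the same files are re-read on every run, one common workaround (a sketch, not from this thread; file names are hypothetical) is to parse each csv once and cache the resulting data.frame in R's binary format, which loads much faster than re-parsing the text:

```r
## Parse once, cache, and reuse the binary copy on later runs.
if (file.exists("myfile.RData")) {
  load("myfile.RData")            # restores the object 'dat'
} else {
  dat <- read.csv("myfile.csv")
  save(dat, file = "myfile.RData")
}
```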