I believe it was already mentioned, but I can recommend the LaF package
(not completely impartial, being the maintainer of LaF ;-)
However, the speed differences between packages will not be very large.
Eventually all packages will have to read in the 6 GB of data and convert
the text data to numeric data, which simply takes time.
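A minimal sketch of how LaF is typically used (the file name and the
column types below are placeholders; adjust them to your data):

library(LaF)

# open the file without reading it; column types are declared up front
laf <- laf_open_csv("big.csv",
                    column_types = c("integer", "double", "string"),
                    skip = 1)                # skip the header line

d <- laf[, c(1, 2)]                          # read only the columns you need

# or walk through the file block by block without holding it all in memory
begin(laf)
while (nrow(block <- next_block(laf, nrows = 100000)) > 0) {
  # ... process 'block' ...
}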
Thank you all very much.
Kevin
On Sat, Apr 27, 2013 at 11:51 AM, Jan van der Laan rh...@eoos.dds.nl wrote:
I believe it was already mentioned, but I can recommend the LaF package [...]
Hi all scientists,
Recently, I have been dealing with big data (a 3 GB txt or csv file) on my
desktop (Windows 7, 64-bit), but I cannot read it in quickly, even though I
have searched the internet for solutions. [I have defined colClasses for
read.table and tried the colbycol and limma packages, but they are not
fast enough.]
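A minimal sketch of the colClasses idea mentioned above (file name and
column types are placeholders):

# declaring types up front lets read.table skip its type-guessing pass;
# giving nrows (even a rough upper bound) also helps it allocate memory
d <- read.table("big.csv", header = TRUE, sep = ",",
                colClasses = c("integer", "numeric", "character"),
                nrows = 3000000, comment.char = "")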
Have you thought of building a database and then letting R read the data
through that DB instead of from the flat file on your desktop?
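A minimal sketch of that route with RSQLite (file name, table name, and
chunk size are placeholders): load the csv into SQLite once, in chunks,
then query only the rows you actually need.

library(RSQLite)

con <- dbConnect(SQLite(), dbname = "big.sqlite")
infile <- file("big.csv", open = "r")

# the first chunk carries the header and creates the table
chunk <- read.csv(infile, nrows = 100000)
dbWriteTable(con, "big", chunk, overwrite = TRUE)

# read.csv on an open connection resumes where the previous read
# stopped, and errors at end of file, which ends the loop
repeat {
  chunk <- tryCatch(read.csv(infile, header = FALSE, nrows = 100000,
                             col.names = names(chunk)),
                    error = function(e) NULL)
  if (is.null(chunk)) break
  dbWriteTable(con, "big", chunk, append = TRUE)
}
close(infile)

# only the rows you ask for ever reach R
d <- dbGetQuery(con, "SELECT * FROM big LIMIT 10")
dbDisconnect(con)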
On Fri, Apr 26, 2013 at 8:09 AM, Kevin Hao rfans4ch...@gmail.com wrote:
Hi all scientists, Recently, I have been dealing with big data (a 3 GB txt or csv file) [...]
Do you really need to load all the data into memory?
For a large data set, people usually read just a chunk of it while
developing the analysis pipeline; once that's done, the finished script
simply iterates through the entire data set. For example, the read.table
function has 'nrows' and 'skip' arguments for exactly this.
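A minimal sketch of that workflow (file name and chunk size are
placeholders):

# develop the pipeline on the first 10,000 rows only
dev <- read.csv("big.csv", nrows = 10000)

# later, pick up the next chunk without rereading the first one
# (skip = 10001 skips the header line plus the 10,000 rows already read)
more <- read.csv("big.csv", skip = 10001, nrows = 10000, header = FALSE,
                 col.names = names(dev))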
On 04/26/2013 08:09 AM, Kevin Hao wrote:
Hi all scientists, Recently, I have been dealing with big data (a 3 GB txt or csv file) [...]
I cannot think of anything better. Maybe try reading only the part of the
data that you want to analyze; basically, break the large data set into
pieces.
On Fri, Apr 26, 2013 at 10:58 AM, Ye Lin ye...@lbl.gov wrote:
Have you thought of building a database and then letting R read the data through that DB [...]
Thanks lcn,
I will try reading the data in chunks.
Best,
Kevin
On Fri, Apr 26, 2013 at 3:05 PM, lcn lcn...@gmail.com wrote:
Do you really need to load all the data into memory? [...]
Thanks.
I will try breaking the data into pieces for analysis.
Kevin
On Fri, Apr 26, 2013 at 4:38 PM, Ye Lin ye...@lbl.gov wrote:
I cannot think of anything better. Maybe try reading only the part of the data that you want to analyze [...]
Hi Ye,
Thanks.
That is a good method. Are there any other methods besides using a
database?
Kevin
On Fri, Apr 26, 2013 at 1:58 PM, Ye Lin ye...@lbl.gov wrote:
Have you thought of building a database and then letting R read the data through that DB [...]
On 13-04-26 3:00 PM, Kevin Hao wrote:
That is a good method. Are there any other methods besides using a database? [...]
If you know the format of the file, you can probably write something in
C (or another language) that is faster than R. Convert your .csv file to
a nice binary format once, then read the binary file in later sessions.
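Staying within base R, a minimal sketch of that convert-once idea using
writeBin/readBin (assumes every column is numeric; file names are
placeholders):

# one-time, slow conversion from text to raw doubles (column-major)
x <- read.csv("big.csv")
writeBin(as.double(as.matrix(x)), "big.bin")

# every later session reads the binary file, skipping text parsing
n <- file.info("big.bin")$size / 8            # 8 bytes per double
v <- readBin("big.bin", what = "double", n = n)
m <- matrix(v, ncol = ncol(x))                # restore the shape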
Thank you very much.
More and more methods are coming in. That sounds great!
Thanks,
Kevin
On Fri, Apr 26, 2013 at 7:51 PM, Duncan Murdoch murdoch.dun...@gmail.com wrote:
If you know the format of the file, you can probably write something in C (or another language) that is faster than R. [...]