You might be interested in the speed wars currently underway in the
file reading/writing space.

Matt Dowle/Arun Srinivasan's data.table and Hadley Wickham/Wes
McKinney's Feather have made huge speed advances in reading/writing
large datasets from disk: data.table mostly for csv files, Feather
via its own binary columnar format.

data.table fread()/fwrite():
https://github.com/Rdatatable/data.table
https://stackoverflow.com/questions/35763574/fastest-way-to-read-in-100-000-dat-gz-files
http://blog.h2o.ai/2016/04/fast-csv-writing-for-r/
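A minimal sketch of the data.table round trip, assuming the data.table package is installed (install.packages("data.table")); the sample data frame and temp file are just for illustration:

```r
library(data.table)

# Build a sample table and write it with the multi-threaded fwrite()
dt <- data.table(id = 1:100000, value = rnorm(100000))
tmp <- tempfile(fileext = ".csv")
fwrite(dt, tmp)

# Read it back with fread(); column types are detected automatically
dt2 <- fread(tmp)
```

For most purposes fread()/fwrite() are drop-in replacements for read.csv()/write.csv(), just much faster on large files.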


Feather read_feather()/write_feather():
https://github.com/wesm/feather
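And the Feather equivalent, a sketch assuming the feather package is installed (install.packages("feather")); again the sample data frame is made up:

```r
library(feather)

df <- data.frame(id = 1:100000, value = rnorm(100000))
tmp <- tempfile(fileext = ".feather")

# Feather stores data frames in a binary columnar format that is
# also readable from Python (pandas), so no csv parsing on re-read
write_feather(df, tmp)
df2 <- read_feather(tmp)
```

Because the format is binary and columnar, reading back skips type inference entirely, which is where much of the speed comes from.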

I don't often have big datasets (10s of MBs at most), so I don't see
much benefit from these myself, but you might.

HTH,
B

On Thu, May 5, 2016 at 3:16 PM, Charles DiMaggio
<charles.dimag...@gmail.com> wrote:
> Been a while, but wanted to close the page on a previous post describing R 
> hanging on readRDS() and load() for largish (say 500MB or larger) files. 
> Tried again with recent release (3.3.0).  Am able to read in large files 
> under El Cap.  While the file is reading in, I get a disconcerting spinning 
> pinwheel of death and a check under Force Quit reports R is not responding.  
> But if I wait it out, it eventually reads in.  Odd.  But I can live with it.
>
> Cheers
>
> Charles
>
> Charles DiMaggio, PhD, MPH
> Professor of Surgery and Population Health
> Director of Injury Research
> Department of Surgery
> New York University School of Medicine
> 462 First Avenue, NBV 15
> New York, NY 10016-9196
> charles.dimag...@nyumc.org
> Office: 212.263.3202
> Mobile: 516.308.6426

_______________________________________________
R-SIG-Mac mailing list
R-SIG-Mac@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-mac
