Hello all,

I’m trying to take a huge dataset (1.5 GB) and separate it into smaller chunks with R.

So far I have had nothing but problems.

I cannot load the whole dataset into R because of memory limits, so instead I try to load a few lines (100,000) at a time with read.table.

However, R keeps crashing (with no error message) at around line 6,800,000. This is extremely frustrating.

To try to fix this, I switched to reading through a connection with read.table. Now I instead get a cryptic error: "no lines available in input".
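For what it's worth, here is the kind of connection-based loop I have been attempting; note that when read.table is given an already-open connection, it resumes from the current position, and the "no lines available in input" error is simply what it raises once the file is exhausted. This is only a sketch: the file name "bigfile.txt", the chunk size, and the separator are placeholders for my actual setup.

```r
# Open the file once; read.table will then read sequentially from it.
con <- file("bigfile.txt", open = "r")

chunk_size <- 100000
chunk_num <- 0

repeat {
  # At end of file, read.table errors with "no lines available in input",
  # so treat that error as the end-of-data signal.
  chunk <- tryCatch(
    read.table(con, nrows = chunk_size, header = FALSE, sep = "\t"),
    error = function(e) NULL
  )
  if (is.null(chunk)) break

  chunk_num <- chunk_num + 1
  # Process or write out this chunk, e.g.:
  write.table(chunk, paste0("chunk_", chunk_num, ".txt"),
              row.names = FALSE, col.names = FALSE, sep = "\t")

  # A short final chunk means the file is done.
  if (nrow(chunk) < chunk_size) break
}

close(con)
```

I gather that supplying colClasses to read.table would also reduce memory use, since R then avoids guessing column types, but I am not sure that alone explains the crashes.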

Is there any way to make this work?

Best,
Guillaume

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
