Quick question: what is the memory limit in R?
I converted to CSV, but only 53,300 of the 1,000,000 rows were read in.  Did
R run out of memory?  If so, is there a workaround?

Thanks,
Mike
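
A minimal sketch for diagnosing this, assuming the exported file is named
mydata.csv (a hypothetical name). Note that R raises an error when it truly
runs out of memory rather than silently truncating, so a short read usually
points to a parsing problem (e.g. a stray quote or embedded newline):

```r
# Count physical lines in the file, independent of read.csv's parsing;
# for 1,000,000 rows plus a header this should be about 1,000,001
n_lines <- length(readLines("mydata.csv"))

# Read the file and compare row counts; a large shortfall suggests an
# unmatched quote swallowed the remaining rows mid-parse
d <- read.csv("mydata.csv")
nrow(d)

# If quoting is the culprit, disabling quote interpretation often
# recovers the full row count
d2 <- read.csv("mydata.csv", quote = "")
nrow(d2)

# Inspect how much memory the object actually occupies
print(object.size(d), units = "Mb")
```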

On Fri, Jul 6, 2012 at 1:24 PM, Duncan Murdoch <murdoch.dun...@gmail.com> wrote:

> On 06/07/2012 1:11 PM, C W wrote:
>
>> Hi all
>> I have a large SAS data set, how do I get it read in R?
>>
>> The data is too big (about 400,000 rows by 100 columns) to be saved as an
>> Excel file.  How should I get it read in R?  Any packages?  I don't seem
>> to
>> find any.
>>
>
> You could write it out in some plain delimited format, e.g. CSV or
> tab-delimited.  Watch out for special characters in strings that confuse R
> when it reads it in (e.g. commas in unquoted CSV strings, quotes within
> strings, etc.)
>
> Duncan Murdoch
>
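
Duncan's caveat about special characters can be sidestepped by letting R
quote character fields on the way out and honouring those quotes on the way
in; a sketch with a small hypothetical data frame:

```r
df <- data.frame(id = 1:3,
                 note = c('plain', 'has, comma', 'has "quote"'),
                 stringsAsFactors = FALSE)

# write.csv quotes character fields by default and doubles embedded
# quotes (qmethod = "double"), so commas and quotes survive the export
write.csv(df, "out.csv", row.names = FALSE)

# read.csv's defaults (quote = "\"") reverse that encoding on the way in
df2 <- read.csv("out.csv", stringsAsFactors = FALSE)
identical(df$note, df2$note)  # TRUE
```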


______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
