Folks:
Suppose I divide the USA into 16 regions. My end goal is to run data mining /
analysis on each of these 16 regions. The data for each region (sales,
forecasts, etc.) will be in the range of 10-20 GB, and at any one time I will
need to load, say, 15 GB into R and then do the analysis.

Is this something other R users are doing, or would it be better to switch to
SAS? Could you help me with any information on this? Thanks.
Satish



______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
