You just need the much smaller cross-product matrix X'X and the vector X'Y, so
you can build those up as you read the data in chunks.
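A minimal sketch of the idea (synthetic data here for illustration; with a file you would read each chunk via read.table on an open connection with nrows = chunk.size): accumulate X'X and X'Y block by block, then solve the normal equations. Only the (p+1) x (p+1) cross-product matrix is ever held in memory, never the full 350,000-row design matrix.

```r
set.seed(1)
n <- 1000; p <- 5
X <- cbind(1, matrix(rnorm(n * p), n, p))  # design matrix with intercept
y <- X %*% c(2, 1:p) + rnorm(n)            # response

chunk.size <- 100
xtx <- matrix(0, p + 1, p + 1)             # running X'X
xty <- numeric(p + 1)                      # running X'Y
for (start in seq(1, n, by = chunk.size)) {
  rows <- start:min(start + chunk.size - 1, n)
  Xc <- X[rows, , drop = FALSE]            # with a file: one read.table() chunk
  xtx <- xtx + crossprod(Xc)               # add this chunk's X'X
  xty <- xty + crossprod(Xc, y[rows])      # add this chunk's X'Y
}
beta <- solve(xtx, xty)  # least-squares coefficients, as lm() would give
```

With many dummy-variable columns, specifying colClasses in read.table also cuts the read-time memory overhead considerably.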


On 4/24/06, Sachin J <[EMAIL PROTECTED]> wrote:
> Hi,
>
>  I have a dataset with 350,000 rows and 266 columns; 250 of the 266
> columns are dummy variables. I am trying to read this dataset into an R
> data frame but cannot, due to memory size limitations (the object created
> is too large for R to handle). Is there a way to handle such a large
> dataset in R?
>
>  My PC has 1 GB of RAM and 55 GB of hard disk space, running Windows XP.
>
>  Any pointers would be of great help.
>
>  TIA
>  Sachin
>
>
>
> ______________________________________________
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
>

