----------------------------------------
> From: dwinsem...@comcast.net
> To: julio.flo...@spss.com.mx
> Date: Thu, 19 May 2011 10:40:08 -0400
> CC: r-help@r-project.org
> Subject: Re: [R] Help, please
>
>
> On May 18, 2011, at 6:29 PM, Julio César Flores Castro wrote:
>
> > Hi,
> >
> > I am using R 2.10.1 and I have a question: do you know how many
> > cases R can handle?
>
> I was able to handle (meaning: do Cox proportional hazards work with
> the 'rms' package, which adds extra memory overhead through a datadist
> object) a 5.5-million-row by 100-column dataframe without difficulty
> using 24 GB on a Mac (BSD UNIX kernel). I was running into performance
> slowdowns related to paging out to virtual memory at 150 columns, but
> after expanding to 32 GB I can now handle 5.5 million records with 200
> columns without paging.
>
> >
> > I want to use the library npmc, but if I have more than 4,500 cases
> > I get an error message. With fewer than 4,500 cases I don't have
> > problems with this library.
> >
> > Is there any way to increase the number of cases so that I can use
> > this library?
>
> 64 bit OS, 64 bit R, and more memory.
>
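The rough arithmetic behind the dataframe sizes quoted above can be sketched in a few lines of R. This is only a back-of-envelope estimate, assuming all columns are numeric (double, 8 bytes per cell) and ignoring the extra copies R makes during model fitting:

```r
# Estimate the raw memory footprint of a 5.5 million row x 100 column
# dataframe of doubles: rows * cols * 8 bytes per double.
rows  <- 5.5e6
cols  <- 100
bytes <- rows * cols * 8
bytes / 2^30   # about 4.1 GiB for the raw data alone
```

In practice the working set during a fit can be several times this figure, which is consistent with the 24-32 GB machines described above.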
The longer-term solution is an implementation and algorithm designed to
increase the coherence of memory accesses (Firefox is doing this to me
now, dropping every few characters and falling many behind as it
thrashes with a memory leak, LOL).
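To follow the advice quoted above (64-bit OS, 64-bit R, more memory), a quick way to check what you are actually running is sketched below; the toy dataframe here is purely illustrative:

```r
# A 64-bit build of R reports an 8-byte pointer; a 32-bit build reports 4.
.Machine$sizeof.pointer

# Measure how large an object actually is before handing it to a fitter.
df <- data.frame(x = rnorm(4500), g = gl(3, 1500))
print(object.size(df), units = "Kb")
```

On Windows, `memory.limit()` additionally reports the memory cap for the R process; `gc()` reports current usage on any platform.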

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
