Thanks, Peter and Andy.

I just found out that it was not a memory problem after all; it was a false
alarm. The 64-bit compiled program works fine!
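
For anyone who hits the same thing, a quick way to confirm which build is
actually running (a minimal check; on a 64-bit build the pointer size is 8
bytes, on a 32-bit build it is 4):

> .Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on 32-bit
[1] 8
> R.version$arch            # the architecture this R was built for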


On 1/3/05 3:39 PM, "Peter Dalgaard" <[EMAIL PROTECTED]> wrote:

> Tae-Hoon Chung <[EMAIL PROTECTED]> writes:
> 
>> Happy new year to all,
>> 
>> A few days ago, I posted a similar problem. At that time, I found out that
>> our R program had been compiled as 32-bit, not 64-bit. So R was
>> re-installed as a 64-bit build and I ran the same job: reading in 150
>> Affymetrix U133A v2 CEL files and performing dChip processing. However, the
>> memory problem happened again. Since we have 64GB of physical memory, I
>> think that should not be a problem. Is there any way to configure memory
>> usage so that all physical memory can be utilized?
>> 
>> Our system is like this:
>> System type: IBM AIX Symmetric Multiprocessing (SMP)
>> OS version: SuSE 8 SP3a
>> CPU: 8
>> Memory: 64GB
> .....
>> expression values: liwong
>> normalizing...Error: cannot allocate vector of size 594075 Kb
>>> gc()
>>            used  (Mb) gc trigger   (Mb)
>> Ncells   797971  21.4    1710298   45.7
> 
> As Brian Ripley told you, 64-bit builds of R have 56-byte Ncells, so if
> yours were one, you should have
> 
>> 797971*56/1024/1024
> [1] 42.61625
> 
> i.e. 42.6 Mb used for your Ncells, and it seems that you don't (the
> 21.4 Mb reported corresponds to 28-byte Ncells, i.e. a 32-bit build)...
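
On the question above about letting R use all of the physical memory: on a
64-bit build the practical ceilings are the GC limits rather than the address
space, and by default no hard upper limits are set. A sketch of how to
inspect and raise them (the flag values below are made-up illustrations; see
?Memory for the exact syntax on your version):

> mem.limits()   # current caps on Ncells/Vcells; NA means no explicit limit
> gc()           # current usage and the trigger points for collection

and, at startup, e.g.:

$ R --min-vsize=512M --max-vsize=16000M

So an allocation failure on a 64-bit machine with plenty of free RAM usually
points at a 32-bit build (as it did here) or a per-process OS limit (ulimit)
rather than at R itself.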

______________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
