RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-13 Thread Laurent Buffat
Hi Tae-Hoon,

I am very surprised by your answer:

When I try to build an AffyBatch with Bioconductor and R 1.9.1, I am unable
to read and normalize more than 80 HU-133A CEL files on a 32-bit Linux
machine with 4 GB of RAM and 8 GB of swap, with no other processes running
on the machine. I do not want to use justRMA() because I need the probe-level
information in the AffyBatch. And it is not a limit in my R configuration:
if I follow the memory usage during the R session, R uses all 4 GB of RAM
(the swap is untouched) before the memory error occurs.
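
For reference, here is a minimal sketch of the kind of session I mean (it
assumes the CEL files are in the working directory):

    library(affy)            # Bioconductor affy package

    ## Read every CEL file in the working directory into an AffyBatch;
    ## unlike justRMA(), this keeps the probe-level data available.
    abatch <- ReadAffy()

    gc()                     # this is where I watch the memory use

    ## Normalize and summarize the probe-level data.
    eset <- rma(abatch)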

For this reason we are planning to buy a 64-bit Linux machine; but if the
problem can be solved with Mac OS X and 1.5 GB of RAM, I will buy a Mac
rather than a 64-bit Linux computer.

So, what kind of normalization are you doing? One from Bioconductor's affy
package, or something else? Could you give more details?

For the other R & BioC readers:

Do you think there is a difference between Linux and Mac OS X in R's memory
management?

What would be a "good" hardware configuration for 64-bit R under Linux?

Thanks for your help.

laurent




RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-02 Thread Kort, Eric
Yes... unfortunately, it looks like the lab that owns the Irix machine does not have a 
license for Sun's compilers, which are required for compiling a 64-bit R (I used gcc to 
compile the 32-bit version, but have not had success compiling a 64-bit R with gcc on 
the Irix).  But I am sure we will manage to compile a 64-bit R somewhere soon.
 
Thanks,
Eric


Re: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-02 Thread Paul Gilbert
It looks like you have R compiled as a 32-bit application, and you will 
need to compile it as a 64-bit application if you want to address more 
than 4 GB of memory. I am not familiar with the SGI Irix machines, but you 
can do this on many workstations that have processors with a 64-bit 
architecture and an OS that supports it.  The R-admin manual has some 
hints about how to do this for various platforms.
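
A quick way to check which you have, from within R (just a sketch; the
pointer size is 4 bytes on a 32-bit build and 8 on a 64-bit one):

    ## 4 = 32-bit build, 8 = 64-bit build
    .Machine$sizeof.pointer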

Paul Gilbert


RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-02 Thread Kort, Eric
Yes, we are using the HGU-133plus2 chips with 50,000+ probe sets, and I suppose that 
the memory requirements increase geometrically as the chip size increases.

Thanks for your email... I can let you know if we have any success, if you are 
interested, for future reference.

-Eric



Re: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-01 Thread Tae-Hoon Chung
Hi, Eric.
This seems a little bit puzzling to me. Which Affymetrix chip do you use? 
The reason I'm asking is that yesterday I was able to normalize 
150 HU-133A CEL files (containing 22,283 probe sets) using R 1.9.1 on Mac OS 
X 10.3.3 with 1.5 GB of memory. If your chip has more probe sets than this, 
then it would be understandable ...

On Jul 1, 2004, at 2:59 PM, Kort, Eric wrote:
Hello.  By way of background, I am running out of memory when 
attempting to normalize the data from 160 Affymetrix microarrays using 
justRMA (from the affy package).  This is despite making 6 gigabytes 
of swap space available on our SGI Irix machine (which has 2 gigabytes 
of RAM).  I have seen in various discussions statements such as "you 
will need at least 6 gigabytes of memory to normalize that many 
chips", but my question is this:

I cannot set the memory limits of R (1.9.1) higher than 4 gigabytes, as 
attempting to do so results in this message:

WARNING: --max-vsize=4098M=4098`M': too large and ignored

I experience this both on my Windows box (on which I cannot allocate 
more than 4 gigabytes of swap space anyway) and on the above-mentioned 
SGI Irix machine (on which I can).  In view of that, I do not see what 
good it does to make more than 4 gigabytes of RAM+swap space available.  
Does this mean 4 gigabytes is the absolute upper limit of R's memory 
usage... or perhaps 8 gigabytes, since you can set both the stack and 
the heap size to 4 gigabytes?

Thanks,
Eric


Tae-Hoon Chung, Ph.D
Post-doctoral Research Fellow
Molecular Diagnostics and Target Validation Division
Translational Genomics Research Institute
1275 W Washington St, Tempe AZ 85281 USA
Phone: 602-343-8724


RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-01 Thread Kort, Eric


>From: Liaw, Andy [mailto:[EMAIL PROTECTED]
>
>Did you compile R as a 64-bit executable on the Irix?  If not, R will be
>subject to the 4 GB limit of 32-bit systems.
>

No...

>Search the archive for `Opteron' and you'll see that the limit is not 4 GB
>for 64-bit executables.
>
>Andy

Excellent.  I will recompile and try again.
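
(Once it is rebuilt, a quick sanity check that the 4 GB ceiling is really
gone -- a sketch, not a benchmark:)

    ## ~4.8 GB of doubles; this allocation fails on a 32-bit build
    x <- numeric(6e8)
    object.size(x)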

Thanks,
Eric



RE: [R] Absolute ceiling on R's memory usage = 4 gigabytes?

2004-07-01 Thread Liaw, Andy
Did you compile R as a 64-bit executable on the Irix?  If not, R will be
subject to the 4 GB limit of 32-bit systems.

Search the archive for `Opteron' and you'll see that the limit is not 4 GB
for 64-bit executables.

Andy
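
For reference, the ceilings in question are the ones set at startup with
R's command-line flags; a minimal sketch of inspecting them from inside R:

    ## Startup flags such as   R --max-vsize=2048M --max-nsize=20M
    ## set the ceilings; mem.limits() reports the ones in force
    ## (NA means no limit has been imposed).
    mem.limits()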
