Yes, we are using the HG-U133 Plus 2.0 chips with 50,000+ probe sets, and I suppose 
the memory requirements grow roughly in proportion to the chip size (probes x arrays), 
which would explain the difference.
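
For a rough sense of scale, the probe-level intensity matrix alone works out to 
something like the following (a back-of-the-envelope sketch in R; the probe count 
per array is approximate, and justRMA allocates several working copies on top of 
this single matrix):

    probes <- 1350000   # approx. probe-level cells per HG-U133 Plus 2.0 array
    arrays <- 160
    bytes  <- probes * arrays * 8   # R stores numerics as 8-byte doubles
    bytes / 2^30                    # ~1.6 GB for a single copy of the matrix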
 
Thanks for your email. If you are interested, I can let you know for future 
reference whether we have any success.
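
For future reference, this is how we have been inspecting the limits from within 
R (a minimal sketch for R 1.9.x; see the --max-vsize warning quoted below):

    ## limits are set at startup, e.g.:  R --max-vsize=2048M --max-nsize=20M
    mem.limits()   # reports the current nsize/vsize ceilings (NA = none set)
    gc()           # runs a garbage collection and reports current usage in Mb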
 
-Eric

        -----Original Message----- 
        From: Tae-Hoon Chung [mailto:[EMAIL PROTECTED] 
        Sent: Thu 7/1/2004 7:52 PM 
        To: Kort, Eric 
        Cc: [EMAIL PROTECTED] 
        Subject: Re: [R] Absolute ceiling on R's memory usage = 4 gigabytes?
        
        

        Hi, Eric.
        This seems a little puzzling to me. Which Affymetrix chip are you
        using? I ask because yesterday I was able to normalize 150 HG-U133A
        CEL files (22,283 probe sets each) using R 1.9.1 on Mac OS X 10.3.3
        with 1.5 GB of memory. If your chip has many more probes than that,
        the higher memory demand would be understandable ...
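
        For reference, a justRMA run of the sort you describe amounts to
        little more than this (a minimal sketch assuming the affy package
        from Bioconductor and CEL files in the working directory; the output
        file name is made up):

            library(affy)        # provides justRMA() and write.exprs()
            eset <- justRMA()    # reads all CEL files, RMA-normalizes them
            write.exprs(eset, file = "hu133a-rma.txt")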
        
        On Jul 1, 2004, at 2:59 PM, Kort, Eric wrote:
        
        > Hello.  By way of background, I am running out of memory when
        > attempting to normalize the data from 160 Affymetrix microarrays using
        > justRMA (from the affy package).  This is despite making 6 gigabytes
        > of swap space available on our SGI IRIX machine (which has 2 gigabytes
        > of RAM).  I have seen in various discussions statements such as "you
        > will need at least 6 gigabytes of memory to normalize that many
        > chips", but my question is this:
        >
        > I cannot set the memory limits of R (1.9.1) higher than 4 gigabytes as
        > attempting to do so results in this message:
        >
        > WARNING: --max-vsize=4098M=4098`M': too large and ignored
        >
        > I experience this both on my Windows box (on which I cannot allocate
        > more than 4 gigabytes of swap space anyway) and on the above-mentioned
        > SGI IRIX machine (on which I can).  In view of that, I do not see
        > what good it does to make more than 4 gigabytes of RAM+swap space
        > available.  Does this mean 4 gigabytes is the absolute upper limit of
        > R's memory usage...or perhaps 8 gigabytes, since you can set both the
        > stack and the heap size to 4 gigabytes?
        >
        > Thanks,
        > Eric
        >
        Tae-Hoon Chung, Ph.D.
        
        Post-doctoral Research Fellow
        Molecular Diagnostics and Target Validation Division
        Translational Genomics Research Institute
        1275 W Washington St, Tempe AZ 85281 USA
        Phone: 602-343-8724
        
        

