On Mar 24, 2009, at 14:55, Booman, M wrote:

Dear all,

I am going to purchase a Power Mac (a new one, with a Nehalem processor) for my R-based microarray analyses. I use mainly Bioconductor packages, and a typical dataset would consist of 50 microarrays with 40,000 datapoints each. To make the right choice of processor and memory, I have a few questions:


I don't use BioC [you may want to ask on the BioC list instead (or hopefully some BioC users will chip in)], so my recommendations may be based on slightly different problems.


- would the current version of R benefit from the 8 cores in the new Intel Xeon Nehalem 8-core Mac Pro? So would an 8-core 2.26GHz machine be better than a 4-core 2.93GHz?

Unfortunately I cannot comment on Nehalems, but in general with Xeons you do feel quite a difference in clock speed, so I wouldn't trade 2.93GHz for 2.26GHz regardless of the CPU generation. It is true that pre-Nehalem Mac Pros cannot feed 8 cores, so you want to go for the new Mac Pros, but I wouldn't even think about the 2.26GHz option. Some benchmarks suggest that the 2.26GHz Nehalem can still compete favorably when a lot of memory/IO is involved, but they were not very convincing and I cannot tell first-hand.


Or can R only use one core (in which case the 4-core 2.93GHZ machine would be better)?


R can use multiple cores in many ways - through BLAS (the default in R for Mac OS X), vector-op parallelization (Luke's pnmath), or explicit parallelization such as forking (multicore) or parallel worker processes (snow). The amount of parallelization achievable depends heavily on your application. I routinely use all cores, but then I'm usually modeling my problems that way.
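For illustration, a minimal sketch of the two explicit routes (the per-task function here is just a stand-in for whatever you would actually run per array):

  library(multicore)                        # forking; Mac OS X / unix only
  analyze <- function(i) mean(rnorm(1e6))   # stand-in for real per-array work
  res <- mclapply(1:50, analyze)            # tasks are spread across the cores

  library(snow)                             # separate worker processes
  cl <- makeCluster(4, type = "SOCK")       # start 4 workers
  res <- parLapply(cl, 1:50, analyze)
  stopCluster(cl)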


- If R does not benefit from multiple cores yet, is there anything known about whether Snow Leopard might make a difference in this?


I cannot comment on details of ongoing work due to the NDA associated with Snow Leopard, but technically, from the Apple announcements, you can deduce that the only possible improvements directly related to R can be achieved in implicit parallelization, which is essentially the pnmath path. There is not much more you can do in R short of a rewrite of the methods you want to use.

In fact, the hope is rather that R packages will start using parallelization more effectively, but that's not something Snow Leopard alone can change.
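To give an idea of the pnmath path: loading it transparently replaces R's internal vectorized math functions with threaded versions, so existing code picks up the extra cores without any changes. A minimal sketch, assuming pnmath is installed:

  library(pnmath)        # Luke's package; nothing else to change
  x <- runif(1e7)
  y <- qbeta(x, 2, 3)    # expensive vectorized ops now use all cores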


- To determine if my first priority should be processor speed or RAM, on which does R rely more heavily?


In my line of work (which is not bioinformatics, though) RAM turned out to be more important, because the drop-off when you run out of memory is sudden and devastatingly huge. With a slower CPU you just have to wait a bit longer, and the difference is directly proportional to the CPU speed you get, so it is never as bad as running out of physical RAM. (BTW: in general you don't want to buy RAM from Apple - as much as I like Apple, there are compatible RAM kits at a fraction of the cost of what Apple charges, especially for Mac Pros - but there is always the 1st-generation issue *).
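As a practical aside, you can watch this from within R itself; a quick sketch using base functions only (the object is just a toy stand-in):

  x <- rnorm(1e6)                       # toy stand-in for a real dataset
  print(object.size(x), units = "Mb")   # footprint of one object, ~7.6 Mb here
  gc()                                  # collect and report total memory in use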


- The new chipset has 3 memory channels (forgive me if I word this wrong; as you may have noticed, I am no computer tech), so it can read 6GB of RAM faster than it can read 8GB of RAM; so for a program that relies more on RAM speed than RAM quantity, it is recommended to use 6GB instead of 8GB (or any multiple of 3) for better performance. Which is more important for R, RAM speed or RAM quantity?


6GB is very little RAM, so I don't think that's an option ;) - but yes, you should care about size first. The channels and timings only define how you populate the slots. Note that the 4-core Nehalem has only 4 slots, so it's not very expandable - I'd definitely get an old-generation 8-core with 16GB of RAM or more rather than something that can take only 8GB ...
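For scale, a back-of-envelope check on the dataset you describe, assuming a plain double-precision expression matrix (8 bytes per value):

  50 * 40000 * 8 / 2^20   # ~15 MB for a single copy of the matrix

A single copy is tiny, but real analyses hold many intermediate copies (normalized data, residuals, permutation results), so the working set grows far beyond the raw matrix - and that is where the RAM ceiling starts to matter.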


(I am not sure if it helps to know, but previously I used a Power Mac G5 quad-core (sadly I forget the processor speed, but it was the standard G5 quad-core) with 4 GB of RAM for datasets of 30-40 microarrays with 18,000 datapoints each, and analysis was OK except for some memory errors in a script that used permutation analysis; but it wasn't very fast.)


I would keep an eye on RAM expandability - even if you buy less RAM now, a ceiling of 8GB is very low. It may turn out that larger DIMMs become available, but 16GB is not enough for the future, either. As with all 1st-generation products the prices will drop a lot over time, so you may plan to upgrade later. Another point worth considering is that you can always upgrade RAM easily, whereas a CPU upgrade is much more difficult.

Cheers,
Simon

_______________________________________________
R-SIG-Mac mailing list
R-SIG-Mac@stat.math.ethz.ch
https://stat.ethz.ch/mailman/listinfo/r-sig-mac
