Hello,

I know of various methods out there for using multiple processors, but I'm not
sure which would be the best solution. First, some things to note:
I'm running dependent simulations, so straightforward parallel coding is out
(multicore, doSNOW, etc.).
I'm on Windows, don't know C, and don't plan on learning C or any of the
*nix languages.

My main concern is running multiple analyses on large data sets. By large I
mean that when I'm done running two simulations, R is using ~3 GB of RAM, and
the remaining ~3 GB gets chewed up when I try to compute the Gelman-Rubin
statistic comparing the two resulting samples, grinding the process to a halt.
I'd like separate cores to run each analysis simultaneously; that would save
time, and I can ponder the BGR calculation problem another way. Can R
temporarily write calculations to hard-disk space instead of holding
everything in RAM?
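To make the first question concrete, here is roughly what I have in mind.
run_chain is just a placeholder for my actual sampler, and I'm assuming a
socket cluster from the parallel (or snow) package is a reasonable way to do
this on Windows:

library(parallel)

## placeholder for one complete simulation run; the real sampler goes here
run_chain <- function(seed) {
  set.seed(seed)
  matrix(rnorm(1e4), ncol = 10)   # pretend this is the chain output
}

cl <- makeCluster(2)              # two separate R processes, each with its own memory
## clusterExport(cl, "mydata")    # ship over whatever the real sampler needs
chains <- parLapply(cl, c(101, 202), run_chain)
stopCluster(cl)

## chains[[1]] and chains[[2]] come back to the master session,
## where the Gelman-Rubin comparison would then still have to fit in memory

For the disk question, I've seen the ff and bigmemory packages mentioned as
ways to keep large objects on disk rather than in RAM, but I don't know
whether they would play nicely with the BGR calculation.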

The second concern boils down to whether there is a way to split up dependent
simulations. For example, at iteration t I feed a(t-2) into FUN1 to generate
a(t), then feed a(t), b(t-1), and c(t-1) into FUN2 to simulate b(t) and c(t).
I'd love to have one core run FUN1 and another run FUN2, and better yet a
third to handle all the pre- and post-processing tidbits! A rough sketch of
the kind of pipelining I'm imagining is below.
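In the sketch, the future package is just a stand-in for whatever the right
tool is, and FUN1/FUN2 are toy placeholders; the idea is that while one worker
simulates b(t) and c(t) from a(t), the other can already be generating a(t+1),
since that only needs a(t-1):

library(future)
plan(multisession, workers = 2)   # two background R processes; works on Windows

## toy placeholders for the real updates
FUN1 <- function(a_lag2)  a_lag2 + rnorm(1)
FUN2 <- function(a, b, c) list(b = b + a, c = c - a)

n_iter <- 10
a  <- numeric(n_iter + 3)   # a[t + 2] holds a(t), so a[1:2] are the two starting values
b  <- numeric(n_iter + 1)   # b[t + 1] holds b(t)
c_ <- numeric(n_iter + 1)   # c[t + 1] holds c(t)  (c_ to avoid clashing with base c())
a[1:2] <- c(0.1, 0.2)

a[3] <- FUN1(a[1])          # a(1) is needed before the first FUN2 call

for (t in seq_len(n_iter)) {
  f_bc <- future(FUN2(a[t + 2], b[t], c_[t]))  # b(t), c(t) on one worker
  f_a  <- future(FUN1(a[t + 1]))               # a(t+1) on the other; only needs a(t-1)
  ## the master process is free here for pre- and post-processing
  res       <- value(f_bc)
  b[t + 1]  <- res$b
  c_[t + 1] <- res$c
  a[t + 3]  <- value(f_a)
}

Whether this actually saves time presumably depends on FUN1 and FUN2 being
expensive relative to the cost of shipping a(t), b(t-1), and c(t-1) between
processes at every iteration, which is exactly the kind of trade-off I'd like
advice on.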


So if anyone has suggestions about a direction I could look into, it would be
appreciated.


Robin Jeffries
MS, DrPH Candidate
Department of Biostatistics
UCLA
530-633-STAT(7828)

