I have a question concerning the working memory used by R.

I am running a simulation job (a function that replicates a simulation a given number of
times) which calls several other functions that I have written. Something like this:

name1 <- function(n_replications){
        for(i in 1:n_replications){
                name2(parameters)
                name3()
                # get results and aggregate
                # open ...
                write(..., append=TRUE)
        }
}

name2 <- function(parameters){
        # simulate data
        ...
        # save data and parameters
        # write ...
}

name3 <- function(){
        # get data and parameters
        # open ...
        # analyse data
        ...
        # save results
        # write ...
}

name1(100)


However, after a couple of replications R's working memory seems to be full:

Error: cannot allocate vector of size ...Kb
In addition: Warning message: 
Reached total allocation of 126Mb: see help(memory.size)
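
As far as I can tell from help(memory.size), these are the calls for inspecting R's
memory use on Windows (which is where I am running this); just a sketch, in case it
is relevant:

# Sketch: inspecting memory use (memory.size()/memory.limit() are
# Windows-only; help(memory.size) documents both).
print(gc())                      # force a garbage collection, show Mb in use
print(memory.size())             # Mb currently used by R
print(memory.size(max = TRUE))   # maximum Mb obtained from the OS so far
print(memory.limit())            # current allocation limit in Mb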

This puzzles me, because all arrays (note: large arrays, i.e. 500x500) are defined
locally (within each function) and are redefined on every replication of the
meta-function. So my question is: why is R eating up my working memory when, as far
as I can tell, I am just redefining the same (local) arrays? Any thoughts on how this
can be circumvented?
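
For what it is worth, here is a stripped-down, self-contained sketch of the pattern I
mean (the function names, file names and the "analysis" are placeholders, not my real
code): 500x500 matrices are created locally inside the functions and overwritten on
every replication, and gc() is printed after each replication so memory use can be
watched.

# Stripped-down sketch of the pattern above; sim_one(), analyse_one(),
# run_all() and the file names are placeholders for my real code.
sim_one <- function(n = 500){
        # simulate data: a local n x n matrix, recreated on every call
        x <- matrix(rnorm(n * n), nrow = n, ncol = n)
        # save data
        write.table(x, file = "simdata.txt",
                    row.names = FALSE, col.names = FALSE)
        invisible(NULL)
}

analyse_one <- function(){
        # get data back and compute a (placeholder) summary
        x <- as.matrix(read.table("simdata.txt", header = FALSE))
        res <- mean(x)
        # save results, one line appended per replication
        write(res, file = "results.txt", append = TRUE)
        invisible(res)
}

run_all <- function(n_replications){
        for(i in 1:n_replications){
                sim_one()
                analyse_one()
                # watch memory use after each replication
                print(gc())
        }
}

run_all(5)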

Thanks,

Peter.
