Thanks for the responses.

@Patrick Burns

I'm going to try running on a 64 bit machine. Unfortunately R isn't
installed properly on it yet and our admin guy is away, so it'll have to
wait.
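
In the meantime, here's a quick sanity check for when R is set up there, to
confirm the binary really is 64-bit (base R only; a minimal sketch):

    # Architecture of the running R build
    R.version$arch            # e.g. "x86_64" on a 64-bit build
    # Pointer size in bytes: 8 on a 64-bit build, 4 on a 32-bit one
    .Machine$sizeof.pointer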

@Uwe Ligges

Unless the program suddenly starts generating masses and masses of data, I
don't think this is the problem. I've kept an eye on how much memory the
program is using, and it has never used more than about 5% of the available
memory.
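
In case it's useful, this kind of check can be done from within R itself
(base R only; "mydata" below is just a placeholder for the object being
measured):

    # The "max used" column reports the peak memory R has needed so far
    gc()
    # Size of one particular object, e.g. the data matrix
    format(object.size(mydata), units = "Mb")

As for a per-process quota, running "ulimit -v" in the shell before
starting R shows the virtual memory limit (in kB), if one is set.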



On 9/29/09, Uwe Ligges <lig...@statistik.tu-dortmund.de> wrote:
>
>
>
> davew0000 wrote:
>
>> Hi all,
>>
>> I'm running an analysis with the random forest tool. It's being applied
>> to a data matrix of ~60,000 rows and between about 40 and 200 columns. I
>> get the same error with all of the data files ("Cannot allocate vector
>> of size 428.5 MB").
>> I found dozens of threads regarding this problem, but they never seem to
>> be concluded. Usually the OP is directed to the memory allocation help
>> file (whose solution for Linux I haven't understood), and the last post
>> is the OP saying they haven't sorted out their problem yet.
>> I'm running on a Linux machine with 64 GB RAM, so it's not a problem
>> with lack of system resources.
>> Can anyone tell me how I can get R to allocate larger vectors on Linux?
>>
>
>
> 1. Check how much memory R used at the point the error message appeared.
> If it is roughly 60 GB, you know that it is a lack of resources for the
> given problem. If it is much less (around 2 GB), you might have a 32-bit
> R binary, or there may be a memory quota on your process.
>
> Uwe Ligges
>
>
>
>> Many thanks,
>>
>> Dave
>>
>
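
Coming back to the allocation error itself: if the 64-bit machine doesn't
solve it, one workaround often suggested for randomForest is to grow the
forest in several smaller runs and merge them with combine(), so no single
call has to allocate the whole forest at once. A sketch only (x and y are
hypothetical names for the predictor matrix and response):

    library(randomForest)

    # Grow five forests of 100 trees each instead of one 500-tree forest
    rf.parts <- lapply(1:5, function(i) randomForest(x, y, ntree = 100))
    # Merge the pieces into a single randomForest object
    rf.all <- do.call(combine, rf.parts)

This doesn't shrink the data itself, but it caps the per-call allocations,
which is often where the "cannot allocate vector" error comes from.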

