Re: [R] cannot allocate memory block of size 2.7 Gb

2013-01-26 Thread Uwe Ligges



On 23.01.2013 23:41, Cláudio Brisolara wrote:





Hello R-users

I am getting error messages when I require some packages or execute some
procedures, like those below:


> require(tseries)
Loading required package: tseries
Error in get(Info[i, 1], envir = env) :
   cannot allocate memory block of size 2.7 Gb

> require(TSA)
Loading required package: TSA
Loading required package: locfit
Error in get(Info[i, 1], envir = env) :
   cannot allocate memory block of size 2.7 Gb
Failed with error:  ‘package ‘locfit’ could not be loaded’

I used the commands memory.limit() and memory.size() to check for memory
limitations, but I could not see any problem. I also include sessionInfo()
output below. I have run the same script on different computers with less
memory, so it seems to me that it is not a real memory problem.


> memory.limit()
[1] 6004


Apparently you need more than that.
But we do not know how cluttered your workspace is, or what you did such
that at least 2.7 Gb of additional memory is required in your next step.
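
As a first step it may help to see which objects dominate the workspace; a
minimal sketch (untested; adapt to your own session):

## list workspace objects by size, largest first
sizes <- sapply(ls(), function(x) object.size(get(x)))
sort(sizes, decreasing = TRUE)
## or start from a clean workspace and retry:
## rm(list = ls()); gc()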


Best,
Uwe Ligges






> memory.size()
[1] 1361.88


> sessionInfo()

R version 2.15.2 (2012-10-26)
Platform: x86_64-w64-mingw32/x64 (64-bit)

locale:
[1] LC_COLLATE=Portuguese_Brazil.1252  LC_CTYPE=Portuguese_Brazil.1252
[3] LC_MONETARY=Portuguese_Brazil.1252 LC_NUMERIC=C
[5] LC_TIME=Portuguese_Brazil.1252

loaded via a namespace (and not attached):
[1] grid_2.15.2  quadprog_1.5-4   stabledist_0.6-5 tools_2.15.2
[5] xtable_1.7-0


Please, can someone help me understand what is happening and what I should
do to fix it?

Regards,

Cláudio Brisolara
Postgraduate student
University of São Paulo






[R] cannot allocate memory block of size 2.7 Gb

2013-01-23 Thread Cláudio Brisolara




Hello R-users

I am getting error messages when I require some packages or execute some
procedures, like those below:

> require(tseries)
Loading required package: tseries
Error in get(Info[i, 1], envir = env) :
  cannot allocate memory block of size 2.7 Gb

> require(TSA)
Loading required package: TSA
Loading required package: locfit
Error in get(Info[i, 1], envir = env) :
  cannot allocate memory block of size 2.7 Gb
Failed with error:  ‘package ‘locfit’ could not be loaded’

I used the commands memory.limit() and memory.size() to check for memory
limitations, but I could not see any problem. I also include sessionInfo()
output below. I have run the same script on different computers with less
memory, so it seems to me that it is not a real memory problem.

> memory.limit()
[1] 6004
> memory.size()
[1] 1361.88

> sessionInfo()
R version 2.15.2 (2012-10-26)
Platform: x86_64-w64-mingw32/x64 (64-bit)

locale:
[1] LC_COLLATE=Portuguese_Brazil.1252  LC_CTYPE=Portuguese_Brazil.1252   
[3] LC_MONETARY=Portuguese_Brazil.1252 LC_NUMERIC=C  
[5] LC_TIME=Portuguese_Brazil.1252
 
loaded via a namespace (and not attached):
[1] grid_2.15.2  quadprog_1.5-4   stabledist_0.6-5 tools_2.15.2
[5] xtable_1.7-0


Please, can someone help me understand what is happening and what I should
do to fix it?

Regards,

Cláudio Brisolara
Postgraduate student
University of São Paulo
  


Re: [R] Cannot allocate memory block

2011-02-17 Thread Uwe Ligges



On 16.02.2011 22:38, poisontonic wrote:



Uwe Ligges-3 wrote:


If the available space got too fragmented, there is no single 3.8 Gb block
of memory available any more.



Is there anything I can do to prevent this?


If you did it after a fresh reboot: I don't see a way to prevent it.
Nevertheless, I doubt you really have that much memory free in that 
case. Have you inspected how much memory was already allocated by R?
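
On Windows you can compare what the session has already claimed against the
configured limit; a quick sketch:

## memory allocated by this R session vs. the limit (Windows-only calls)
memory.size()            # Mb currently in use by R
memory.size(max = TRUE)  # high-water mark so far
memory.limit()           # configured limit in Mb
gc()                     # collect garbage and report usage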


Uwe Ligges


I've restarted and rerun the whole thing straight up, and still the error...?

Ben




Re: [R] Cannot allocate memory block

2011-02-16 Thread Uwe Ligges



On 15.02.2011 21:05, poisontonic wrote:


Hi, I'm using the latest version of 64-bit R for Windows: R x64 2.12.1

I'm using it because I currently need to do hierarchical clustering on a
very large object (too big for MATLAB, which I normally use).
When I try to cluster my distance matrix d (obtained using dist on my design
matrix):
hc <- hclust(d, method = 'average')

I get an error:
Error in hclust(d, method = "average") :
   cannot allocate memory block of size 3.8 Gb

However, the memory limit appears to be 16GB:

> memory.limit()
[1] 16378

Does anyone know why R cannot allocate a memory block of size 3.8 GB, even
though this is well within its memory limits??


If the available space got too fragmented, there is no single 3.8 Gb block
of memory available any more.
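
For a sense of scale: a "dist" object stores n(n-1)/2 doubles, so a single
3.8 Gb block is about what a distance matrix on roughly 32,000 observations
needs. A back-of-the-envelope sketch (the n is illustrative):

## approximate size of a dist object for n observations
n <- 32000
bytes <- n * (n - 1) / 2 * 8   # n(n-1)/2 distances, 8 bytes each
bytes / 1024^3                 # ~3.8 Gb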


Uwe Ligges







Any help would be greatly appreciated!
Thanks a lot,

Ben




Re: [R] Cannot allocate memory block

2011-02-16 Thread poisontonic


Uwe Ligges-3 wrote:
 
 If the available space got too fragmented, there is no single 3.8 Gb block 
 of memory available any more.
 

Is there anything I can do to prevent this? I've restarted and rerun the
whole thing straight up, and still the error...?

Ben



[R] Cannot allocate memory block

2011-02-15 Thread poisontonic

Hi, I'm using the latest version of 64-bit R for Windows: R x64 2.12.1

I'm using it because I currently need to do hierarchical clustering on a
very large object (too big for MATLAB, which I normally use).
When I try to cluster my distance matrix d (obtained using dist on my design
matrix):
hc <- hclust(d, method = 'average')

I get an error:
Error in hclust(d, method = "average") :
  cannot allocate memory block of size 3.8 Gb

However, the memory limit appears to be 16GB:
> memory.limit()
[1] 16378

Does anyone know why R cannot allocate a memory block of size 3.8 GB, even
though this is well within its memory limits??
Any help would be greatly appreciated!
Thanks a lot,

Ben



[R] Cannot allocate memory of size x on Linux - what's the solution?

2009-09-29 Thread davew0000

Hi all,

I'm running an analysis with the random forest tool. It's being applied to a
data matrix of ~60,000 rows and between about 40 and 200 columns. I get the
same error with all of the data files ("Cannot allocate vector of size
428.5 Mb").

I found dozens of threads regarding this problem, but they never seem to be
concluded. Usually the OP is directed to the memory allocation help file
(whose solution for Linux I haven't understood), and the last post is
the OP saying they haven't sorted out their problem yet.

I'm running on a Linux machine with 64GB RAM, so it's not a problem with
a lack of system resources.

Can anyone tell me how I can get R to allocate larger vectors on Linux? 

Many thanks,

Dave



Re: [R] Cannot allocate memory of size x on Linux - what's the solution?

2009-09-29 Thread Uwe Ligges



davew wrote:

Hi all,

I'm running an analysis with the random forest tool. It's being applied to a
data matrix of ~60,000 rows and between about 40 and 200 columns. I get the
same error with all of the data files ("Cannot allocate vector of size
428.5 Mb").


I found dozens of threads regarding this problem, but they never seem to be
concluded. Usually the OP is directed to the memory allocation help file
(whose solution for Linux I haven't understood), and the last post is
the OP saying they haven't sorted out their problem yet.


I'm running on a Linux machine with 64GB RAM, so it's not a problem with
a lack of system resources.

Can anyone tell me how I can get R to allocate larger vectors on Linux? 



1. Check how much memory R used at the point the error message appeared.
If it is roughly 60 Gb, you know that it is a lack of resources - for
the given problem. If it is much less (around 2 Gb), you might have a
32-bit R binary or some memory quota on your process.
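
Two quick checks along those lines (a sketch):

.Machine$sizeof.pointer  # 8 means a 64-bit binary, 4 means 32-bit
gc()                     # see the "max used" column for the high-water mark
## a per-process quota would show up in the shell, e.g. via: ulimit -v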


Uwe Ligges




Many thanks,

Dave




Re: [R] Cannot allocate memory of size x on Linux - what's the solution?

2009-09-29 Thread Dave Wood
Thanks for the responses

@Patrick Burns

I'm going to try running on a 64-bit machine. Unfortunately R isn't
installed properly on it yet and our admin guy is away, so it'll have to
wait.

@Uwe Ligges

Unless the program suddenly starts generating masses and masses of data, I
don't think this is the problem. I've kept an eye on how much memory the
program is using and it has never taken more than about 5% of the memory
available.



On 9/29/09, Uwe Ligges lig...@statistik.tu-dortmund.de wrote:



 davew wrote:

 Hi all,

 I'm running an analysis with the random forest tool. It's being applied to
 a data matrix of ~60,000 rows and between about 40 and 200 columns. I get
 the same error with all of the data files ("Cannot allocate vector of size
 428.5 Mb").
 I found dozens of threads regarding this problem, but they never seem to
 be concluded. Usually the OP is directed to the memory allocation help file
 (whose solution for Linux I haven't understood), and the last post is
 the OP saying they haven't sorted out their problem yet.
 I'm running on a Linux machine with 64GB RAM, so it's not a problem with
 a lack of system resources.
 Can anyone tell me how I can get R to allocate larger vectors on Linux?



 1. Check how much memory R used at the point the error message appeared. If
 it is roughly 60 Gb, you know that it is a lack of resources - for the
 given problem. If it is much less (around 2 Gb), you might have a 32-bit R
 binary or some memory quota on your process.

 Uwe Ligges



 Many thanks,

 Dave






Re: [R] cannot allocate memory

2008-09-24 Thread Uwe Ligges



DumpsterBear wrote:

I am getting "Error: cannot allocate vector of size 197 MB".
I know that similar problems were discussed a lot already, but I
didn't find any satisfactory answers so far!

Details:
*** I have XP (32-bit) with 4GB RAM. At the time the problem
appeared I had 1.5GB of available physical memory.
*** I increased the R memory limit to 3GB via memory.limit(3000)



Have you told Windows to allow processes of more than 2GB?
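
(On 32-bit XP that typically means the /3GB boot switch; a sketch, assuming
the switch is enabled and the machine has been rebooted:)

memory.limit(size = 3000)  # with /3GB active, up to ~3 Gb can be granted
memory.limit()             # confirm the new limit, in Mb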



*** I did gs() and got


gc(), I think.

Uwe Ligges



            used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells    147534   4.0     407500   10.9    407500   10.9
Vcells 104939449 800.7  186388073 1422.1 185874684 1418.2

The garbage collection didn't help.

Any ideas? Many thanks in advance!

-- Adam



Re: [R] cannot allocate memory

2008-09-24 Thread Bernardo Rangel Tura
On Tue, 2008-09-23 at 21:42 -0400, DumpsterBear wrote:
 I am getting "Error: cannot allocate vector of size 197 MB".
 I know that similar problems were discussed a lot already, but I
 didn't find any satisfactory answers so far!
 
 Details:
 *** I have XP (32-bit) with 4GB RAM. At the time the problem
 appeared I had 1.5GB of available physical memory.
 *** I increased the R memory limit to 3GB via memory.limit(3000)
 *** I did gs() and got
             used  (Mb) gc trigger   (Mb)  max used   (Mb)
 Ncells    147534   4.0     407500   10.9    407500   10.9
 Vcells 104939449 800.7  186388073 1422.1 185874684 1418.2
 
 The garbage collection didn't help.
 
 Any ideas? Many thanks in advance!
 

Adam,

First, is it possible for 32-bit XP to use all of your 4Gb?

Second, I think you mean gc() when you say gs(); on my computer (Ubuntu
64-bit with 4Gb):

> gc()
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 188975 10.1     407500 21.8       35 18.7
Vcells 169133  1.3     786432  6.0   786378  6.0

> gc(reset=T)
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 188951 10.1     407500 21.8   188951 10.1
Vcells 168893  1.3     786432  6.0   168893  1.3

If you read the help page for gc:

reset: logical; if 'TRUE' the values for maximum space used are
  reset to the current values.

Another issue is the command-line options for Rgui.
There is an option, --max-mem-size, that you can modify to expand the RAM
available to R.
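
For example (a sketch; the 3500M value is illustrative):

## start R for Windows with a larger allowance, from the command line:
##   Rgui.exe --max-mem-size=3500M
## or equivalently, from within a running session:
memory.limit(size = 3500)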

-- 
Bernardo Rangel Tura, M.D,MPH,Ph.D
National Institute of Cardiology
Brazil



Re: [R] cannot allocate memory

2008-09-24 Thread DumpsterBear
 I am getting "Error: cannot allocate vector of size 197 MB".
 I know that similar problems were discussed a lot already, but I
 didn't find any satisfactory answers so far!

 Details:
 *** I have XP (32-bit) with 4GB RAM. At the time the problem
 appeared I had 1.5GB of available physical memory.
 *** I increased the R memory limit to 3GB via memory.limit(3000)


 Have you told Windows to allow processes of more than 2GB?

Yes, I did. But this only matters if R requests a memory block bigger than
2GB at one time. As I wrote, I had 1.5GB of physical memory available out
of my 4GB.

 gc(), I think.

Yes, indeed.

Many thanks, Adam



[R] cannot allocate memory

2008-09-23 Thread DumpsterBear
I am getting "Error: cannot allocate vector of size 197 MB".
I know that similar problems were discussed a lot already, but I
didn't find any satisfactory answers so far!

Details:
*** I have XP (32-bit) with 4GB RAM. At the time the problem
appeared I had 1.5GB of available physical memory.
*** I increased the R memory limit to 3GB via memory.limit(3000)
*** I did gs() and got
            used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells    147534   4.0     407500   10.9    407500   10.9
Vcells 104939449 800.7  186388073 1422.1 185874684 1418.2

The garbage collection didn't help.

Any ideas? Many thanks in advance!

-- Adam
