Hi, 

I am using numpy and wish to create very large arrays. My system is an AMD64 X2 running Ubuntu 8.04, which should be the 64-bit build. I have 3 GB of RAM and a 15 GB swap partition.
 

The command I have been trying to use is:

import numpy
g = numpy.ones([1000, 1000, 1000], numpy.int32)

This raises a MemoryError. A smaller array ([500, 500, 500]) worked fine, but allocating two of those smaller arrays crashed the system again.

So I did the math: a 1000x1000x1000 array of 32-bit integers should be around 4 GB (10^9 elements x 4 bytes each). That is obviously larger than RAM, but much smaller than the swap space.
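For what it's worth, here is the quick sanity check I did on that arithmetic (nothing here beyond plain numpy):

import numpy

# 10**9 elements, 4 bytes per int32
n_bytes = 1000 * 1000 * 1000 * numpy.dtype(numpy.int32).itemsize
print(n_bytes)                      # 4000000000 bytes
print(n_bytes / float(1024 ** 3))   # about 3.73 GiB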

1. Does numpy have a lot of per-array overhead, or is my system somehow not making use of the 15 GB swap area?
2. Is there a way I can push the array into swap, or direct numpy to back it with disk? Or do I have to write my own caching scheme for numpy arrays? (See the memmap sketch below for the kind of thing I mean.)
3. How difficult would it be to use data compression internally on numpy arrays?
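On question 2, the closest thing I have found so far is numpy.memmap, which backs an array with a file on disk instead of RAM. A minimal sketch of what I mean (the filename is just a placeholder, and I have not verified this behaves well at this scale):

import numpy

# Disk-backed array: the OS pages chunks in and out as needed,
# so the whole 4 GB never has to sit in physical memory at once.
g = numpy.memmap('/tmp/big_array.dat', dtype=numpy.int32,
                 mode='w+', shape=(1000, 1000, 1000))
g[0, 0, 0] = 1   # writes go to the mapped file
g.flush()        # push pending changes out to disk

Is something like this the intended approach, or is there a better way?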

Thanks very much,
Robert