[EMAIL PROTECTED] wrote:
> I'm running operations on large arrays of floats, approx. 25,000 x 80.
> Python (scipy) does not seem to come close to using 4 GB of wired mem,
> but segments at around a gig. Everything works fine on smaller batches
> of data around 10,000 x 80 and uses a max of ~600 MB of mem.  Any ideas?
> Is this just too much data for scipy?
> 
> Thanks Conor
> 
> Traceback (most recent call last):
>  File "C:\Temp\CR_2\run.py", line 68, in ?
>    net.rProp(1.2, .5, .000001, 50.0, input, output, 1)
>  File "/Users/conorrob/Desktop/CR_2/Network.py", line 230, in rProp
>    print scipy.trace(error*scipy.transpose(error))
>  File "D:\Python24\Lib\site-packages\numpy\core\defmatrix.py", line
> 149, in
> __mul__
>    return N.dot(self, other)
> MemoryError

You should ask this question on the numpy-discussion list for better 
feedback.


Does it actually segfault, or does it just give you this MemoryError?


Temporary arrays created while evaluating an expression could be the 
source of the extra memory. In particular, if error is 25,000 x 80, then 
error*scipy.transpose(error) is a matrix product whose result is 
25,000 x 25,000; at 8 bytes per float64, that single temporary is about 
4.7 GB before scipy.trace() ever sees it.
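
Note also that scipy.trace(E*E.T) is mathematically just the sum of the 
squares of the entries of E (the squared Frobenius norm), so the same 
number can be computed without forming the big product at all. A minimal 
sketch, assuming error is a 25,000 x 80 float64 matrix (the shapes come 
from your post; the variable names are only for illustration):

    import numpy

    error = numpy.matrix(numpy.random.rand(25000, 80))

    # error * error.T would materialize a (25000, 25000) float64
    # temporary (~4.7 GB) just so trace() can read its diagonal.

    # Equivalent result with no large temporary: elementwise square,
    # then sum over all entries.
    total = numpy.multiply(error, error).sum()
    print(total)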


Generally, you should be able to use all of the memory on your system, 
unless you are on a 64-bit system and are not using Python 2.5 (earlier 
versions use 32-bit sizes internally, which caps how large a single 
object can be).
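
One quick way to check which case you are in is the interpreter's 
pointer size (a small diagnostic sketch; struct is in the standard 
library):

    import struct

    # 4 -> 32-bit Python: at most 2-4 GB of address space per process,
    #      no matter how much RAM is installed.
    # 8 -> 64-bit Python.
    print(struct.calcsize("P"))

On a 32-bit interpreter, the ~4.7 GB temporary above can never be 
allocated, which would explain the MemoryError.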



-Travis

