[BangPypers] Multiplying very large matrices

2011-01-15 Thread kunal ghosh
Hi all, while implementing Locality Preserving Projections, at one point I have to perform X L X.transpose(). These matrices are large (32256 x 32256), so I get an "out of memory" error. I assume that as the dataset gets larger one would come across this problem; how would one go about solving this? Is
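A quick back-of-the-envelope check (my own sketch, not from the thread) shows why a single dense matrix of this size already strains a typical machine's RAM:

```python
# Memory footprint of one dense 32256 x 32256 matrix of float64 values.
n = 32256
bytes_needed = n * n * 8            # 8 bytes per float64 element
gib = bytes_needed / 2**30
print(f"{gib:.2f} GiB per matrix")  # ≈ 7.75 GiB
```

Since X, L, and the intermediate product X L each need roughly that much, a naive in-memory X L X.transpose() asks for well over 20 GiB before BLAS temporaries are counted.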

Re: [BangPypers] Multiplying very large matrices

2011-01-15 Thread Santosh Rajan
Hope this helps: http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices On Sat, Jan 15, 2011 at 10:11 PM, kunal ghosh wrote: > Hi all, > while implementing Locality Preserving Projections, > at one point I have to perform X L X.transpose() > these matrices are large (32256 x

Re: [BangPypers] Multiplying very large matrices

2011-01-15 Thread kunal ghosh
Thanks Santosh, this Stack Overflow thread indeed discusses the exact same problem I have. Wonder how I missed it :) in my preliminary searches. Thanks again! On Sat, Jan 15, 2011 at 11:12 PM, Santosh Rajan wrote: > Hope this helps > http://stackoverflow.com/questions/1053928/python-numpy-ve

Re: [BangPypers] Multiplying very large matrices

2011-01-15 Thread kunal ghosh
Hi all, I found numpy.memmap to be very suitable when matrices larger than physical memory are required: 1. it is included in the standard numpy installation, and 2. it has a very low learning curve. PyTables seems more suitable, but I somehow found its learning curve too steep. Also, PyTables needs a lot of in
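As a rough illustration of the memmap approach (my own sketch, not code from the thread), X L X.transpose() can be computed one row-block at a time against disk-backed arrays, so only a small slice is ever resident in RAM. The sizes here are scaled down from the original 32256 for quick execution, and all file names are made up:

```python
import os
import tempfile
import numpy as np

n, block = 1024, 256   # scaled down from the 32256 x 32256 in the thread
tmpdir = tempfile.mkdtemp()

def mm(name, mode):
    # Create a disk-backed float32 array (float32 halves memory vs float64).
    return np.memmap(os.path.join(tmpdir, name), dtype=np.float32,
                     mode=mode, shape=(n, n))

X, L, out = mm("X.dat", "w+"), mm("L.dat", "w+"), mm("out.dat", "w+")

# Fill with sample data one row-block at a time, keeping RAM usage bounded.
rng = np.random.default_rng(0)
for i in range(0, n, block):
    X[i:i + block] = rng.standard_normal((block, n), dtype=np.float32)
    L[i:i + block] = rng.standard_normal((block, n), dtype=np.float32)

# Compute out = X @ L @ X.T one row-block at a time: each iteration holds
# only a (block, n) slice and streams the operands it needs from disk.
for i in range(0, n, block):
    tmp = X[i:i + block] @ L       # (block, n) intermediate
    out[i:i + block] = tmp @ X.T   # (block, n) rows of the final product
out.flush()
```

Note that each iteration still streams all of L and X through memory via the OS page cache, so this trades speed for footprint; for heavily out-of-core workflows, chunked HDF5 storage (PyTables or h5py) or a blocked-computation library may scale better.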