Re: [Numpy-discussion] very large matrices.

2007-05-14 Thread Dave P. Novakovic
> Of course, more physical details would be helpful in better understanding your problem. > Good luck, > val

Re: [Numpy-discussion] very large matrices.

2007-05-14 Thread val
an option in this type of problem. Of course, more physical details would be helpful in better understanding your problem. Good luck, val

Re: [Numpy-discussion] very large matrices.

2007-05-14 Thread Zachary Pincus
Hello Dave, I don't know if this will be useful to your research, but it may be worth pointing out in general. As you know, PCA (and perhaps some other spectral algorithms?) uses eigenvalues of matrices that can be factored as A'A (where ' means transpose). For example, in the PCA case,
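The A'A factorization Zachary alludes to is the basis of the so-called "transpose trick": for an n x d data matrix with n much smaller than d, the d x d covariance A'A is enormous, but A A' is only n x n and shares its nonzero eigenvalues. A minimal illustrative sketch (all names and sizes here are made up for demonstration, not from the thread):

```python
import numpy as np

# Sketch of the "transpose trick" for PCA: for an n x d matrix A with
# n << d, eigendecompose the small n x n Gram matrix A A' instead of
# the huge d x d covariance A'A; the nonzero eigenvalues coincide.
rng = np.random.default_rng(0)
n, d = 50, 2000
A = rng.standard_normal((n, d))
A -= A.mean(axis=0)                 # center columns, as PCA requires

G = A @ A.T                         # small n x n Gram matrix
evals, U = np.linalg.eigh(G)        # ascending eigenvalues
order = np.argsort(evals)[::-1]
evals, U = evals[order], U[:, order]

# Map back to d-space: v_i = A' u_i / sqrt(lambda_i) are the PCA axes.
keep = evals > 1e-8 * evals.max()   # drop the numerically-zero modes
V = A.T @ U[:, keep] / np.sqrt(evals[keep])

# The recovered columns of V are orthonormal principal directions.
assert np.allclose(V.T @ V, np.eye(V.shape[1]), atol=1e-8)
```

For Dave's 75000-row case this only helps if one of the two dimensions is much smaller than the other; for a square matrix the trick buys nothing.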

Re: [Numpy-discussion] very large matrices.

2007-05-14 Thread Giorgio Luciano
If you are using it for PCA, why don't you try the NIPALS algorithm? (Probably a silly question, I just wanted to help :) Giorgio
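NIPALS is attractive here because it extracts one component at a time without ever forming the full covariance matrix. A minimal single-component sketch (the function name, tolerances, and data are illustrative assumptions, not code from the thread):

```python
import numpy as np

# Minimal NIPALS sketch: extract one principal component of a centered
# matrix X by alternating score/loading updates (a power iteration).
def nipals_pc(X, n_iter=1000, tol=1e-12):
    t = X[:, 0].copy()                  # initial score guess
    for _ in range(n_iter):
        p = X.T @ t / (t @ t)           # loading vector
        p /= np.linalg.norm(p)
        t_new = X @ p                   # updated score vector
        if np.linalg.norm(t_new - t) < tol:
            t = t_new
            break
        t = t_new
    return t, p

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))
X -= X.mean(axis=0)                     # NIPALS assumes centered data
t, p = nipals_pc(X)

# Deflate and repeat on the residual for further components:
X1 = X - np.outer(t, p)
```

Memory cost per component is one pass over X plus two vectors, which is the whole point for matrices too large to decompose whole.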

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Dave P. Novakovic
There are definitely elements of spectral graph theory in my research too. I'll summarise: we are interested in seeing what each eigenvector from the SVD can represent in a semantic space. In addition to this we'll be testing it against some algorithms like concept indexing (uses a bipartitional k-meansi

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Charles R Harris
On 5/13/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote: > Are you trying some sort of principal components analysis? PCA is indeed one part of the research I'm doing. I had the impression you were trying to build a linear space in which to embed a model, like atmospheric folk do when they t

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Dave P. Novakovic
> Are you trying some sort of principal components analysis? PCA is indeed one part of the research I'm doing. Dave

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Charles R Harris
On 5/13/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote: They are very large numbers indeed. Thanks for giving me a wake up call. Currently my data is represented as vectors in a vectorset, a typical sparse representation. I reduced the problem significantly by removing lots of noise. I'm basic

Re: [Numpy-discussion] very large matrices.

2007-05-12 Thread Dave P. Novakovic
They are very large numbers indeed. Thanks for giving me a wake up call. Currently my data is represented as vectors in a vectorset, a typical sparse representation. I reduced the problem significantly by removing lots of noise. I'm basically recording traces of a term's occurrence throughout a cor
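A term-by-context occurrence matrix like the one Dave describes is a natural fit for scipy's sparse types, which store only the nonzero entries. A tiny hypothetical sketch (the indices and counts are invented for illustration):

```python
import numpy as np
from scipy import sparse

# Hypothetical term-occurrence matrix: one row per term, one column per
# context/document, built from (row, col, count) triples.
rows = np.array([0, 0, 1, 2, 2, 2])       # term indices
cols = np.array([1, 3, 0, 0, 2, 3])       # context indices
vals = np.array([2.0, 1.0, 5.0, 1.0, 1.0, 3.0])
X = sparse.csr_matrix((vals, (rows, cols)), shape=(3, 4))

print(X.nnz)          # 6 stored entries instead of 12 dense cells
print(X.toarray())    # dense view, only sensible for tiny examples
```

At 75000 x 75000 with a few thousand nonzeros per row, this is the difference between gigabytes and tens of gigabytes of storage.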

Re: [Numpy-discussion] very large matrices.

2007-05-12 Thread Anne Archibald
On 12/05/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote: > core 2 duo with 4gb RAM. > > I've heard about iterative svd functions. I actually need a complete > svd, with all eigenvalues (not LSI). I'm actually more interested in > the individual eigenvectors. > > As an example, a single row could

Re: [Numpy-discussion] very large matrices.

2007-05-12 Thread Dave P. Novakovic
Hey, thanks for the response. core 2 duo with 4gb RAM. I've heard about iterative svd functions. I actually need a complete svd, with all eigenvalues (not LSI). I'm actually more interested in the individual eigenvectors. As an example, a single row could probably have about 3000 non zero elemen
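Dave says he needs a complete SVD, but at this scale only a truncated decomposition is tractable: `scipy.sparse.linalg.svds` (ARPACK-based) returns the k largest singular triplets without densifying the matrix. A small sketch with made-up dimensions and density:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import svds

# Truncated SVD of a sparse matrix via ARPACK: only the k largest
# singular triplets are computed, never the full factorization.
rng_seed = 2
A = sparse.random(1000, 1000, density=0.003, random_state=rng_seed,
                  format="csr")

k = 10
U, s, Vt = svds(A, k=k)
order = np.argsort(s)[::-1]         # sort descending; svds does not
U, s, Vt = U[:, order], s[order], Vt[order]   # guarantee an ordering

print(s[:3])                         # three largest singular values
```

Note that svds requires k strictly less than min(A.shape), so it cannot produce "all eigenvalues" as Dave asks; for the full spectrum of a 75000 x 75000 matrix there is no way around the dense memory cost.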

Re: [Numpy-discussion] very large matrices.

2007-05-12 Thread Charles R Harris
On 5/12/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote: Hi, I have test data of about 75000 x 75000 dimensions. I need to do an SVD, or at least an eigen decomposition, on this data. My searching suggests that the linalg functions in scipy and numpy don't work on sparse matrices. I can't even get

[Numpy-discussion] very large matrices.

2007-05-12 Thread Dave P. Novakovic
Hi, I have test data of about 75000 x 75000 dimensions. I need to do an SVD, or at least an eigen decomposition, on this data. My searching suggests that the linalg functions in scipy and numpy don't work on sparse matrices. I can't even get empty((1,1),dtype=float) to work (memory errors, or
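The memory errors in the original post are predictable from simple arithmetic: a dense square matrix of this size cannot fit in 4 GB of RAM.

```python
# Back-of-envelope memory for a dense 75000 x 75000 float64 matrix.
n = 75_000
bytes_needed = n * n * 8             # 8 bytes per float64 element
print(bytes_needed)                  # 45,000,000,000 bytes
print(bytes_needed / 2**30)          # ~41.9 GiB, vs the 4 GB available
```

And that is just the input; a full SVD needs working copies of U and V' of the same size again, so sparse storage plus truncated or out-of-core methods are the only realistic route here.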