an option in this type of problem. Of course, more physical details
would be helpful in better understanding your problem.
good luck,
val
Hello Dave,
I don't know if this will be useful to your research, but it may be
worth pointing out in general. As you know, PCA (and perhaps some
other spectral algorithms?) uses eigenvalues of matrices that can be
factored as A'A (where ' means transpose). For example, in the PCA
case, A is the centred data matrix, so A'A is (up to a constant) the
covariance matrix whose eigenvectors give the principal directions.
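A minimal sketch of that relationship, with toy sizes and names of my own
choosing: the squared singular values of a centred A equal the eigenvalues
of A'A, and the right singular vectors span the same directions as its
eigenvectors, so a PCA never has to form A'A explicitly.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 10))    # 200 samples, 10 features (toy sizes)
    A = A - A.mean(axis=0)                # centre the columns

    # Eigendecomposition of the Gram matrix A'A ...
    evals, evecs = np.linalg.eigh(A.T @ A)

    # ... versus the SVD of A itself.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    # Squared singular values match the eigenvalues of A'A (up to ordering).
    assert np.allclose(np.sort(s**2), np.sort(evals))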
If you are using it for PCA,
why don't you try the NIPALS algorithm?
(probably a silly question, just wanted to help :)
Giorgio
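For anyone unfamiliar with it, here is a rough sketch of NIPALS for PCA; the
function name and defaults are invented for illustration, not taken from any
particular library. Each component needs only matrix-vector products, so the
full Gram/covariance matrix is never formed.

    import numpy as np

    def nipals_pca(X, n_components=2, tol=1e-8, max_iter=500):
        """One-component-at-a-time PCA via NIPALS (dense toy sketch)."""
        X = X - X.mean(axis=0)              # centre the columns
        scores, loadings = [], []
        for _ in range(n_components):
            t = X[:, 0].copy()              # crude starting score vector
            for _ in range(max_iter):
                p = X.T @ t
                p /= np.linalg.norm(p)      # unit-length loading vector
                t_new = X @ p
                if np.linalg.norm(t_new - t) < tol:
                    t = t_new
                    break
                t = t_new
            scores.append(t)
            loadings.append(p)
            X = X - np.outer(t, p)          # deflate before the next component
        return np.column_stack(scores), np.column_stack(loadings)

    # Example use on random data:
    # T, P = nipals_pca(np.random.default_rng(0).standard_normal((100, 8)))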
There are definitely elements of spectral graph theory in my research
too. I'll summarise:
We are interested in seeing what each eigenvector from the svd can
represent in a semantic space.
In addition to this we'll be testing it against some algorithms like
concept indexing (which uses a bipartitional k-means).
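For the leading vectors at least, one way to poke at what an individual
singular vector "means" without densifying anything is along these lines;
the matrix sizes and the term_i vocabulary below are placeholders, not real
data:

    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import svds

    # Toy term-by-document matrix; real data would be far larger and come
    # from an actual corpus.
    X = sparse.random(1000, 500, density=0.01, format='csr', random_state=0)

    u, s, vt = svds(X, k=5)                  # top few singular triplets only
    order = np.argsort(s)[::-1]              # svds returns ascending values
    u, s, vt = u[:, order], s[order], vt[order]

    # Which terms load most heavily on the first left singular vector?
    terms = [f"term_{i}" for i in range(X.shape[0])]   # placeholder vocabulary
    top = np.argsort(np.abs(u[:, 0]))[::-1][:10]
    print([terms[i] for i in top])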
On 5/13/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote:
> > Are you trying some sort of principal components analysis?
> PCA is indeed one part of the research I'm doing.
I had the impression you were trying to build a linear space in which to
embed a model, like atmospheric folk do when they t
On 5/13/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote:
They are very large numbers indeed. Thanks for giving me a wake up call.
Currently my data is represented as vectors in a vectorset, a typical
sparse representation.
I reduced the problem significantly by removing lots of noise. I'm
basically recording traces of a term's occurrence throughout a corpus.
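In case it is useful, a rough sketch of how that kind of term-occurrence data
is usually held in SciPy's sparse containers; the tiny corpus here is invented
purely for illustration:

    import numpy as np
    from scipy import sparse

    # Hypothetical tokenised documents standing in for the real corpus.
    docs = [["graph", "spectral", "svd"],
            ["svd", "semantic", "space"],
            ["graph", "semantic"]]
    vocab = {t: i for i, t in enumerate(sorted({t for d in docs for t in d}))}

    # Fill a terms x documents count matrix in LIL form (cheap to update),
    # then convert to CSR (cheap to store and to multiply by vectors).
    X = sparse.lil_matrix((len(vocab), len(docs)), dtype=np.float64)
    for j, doc in enumerate(docs):
        for t in doc:
            X[vocab[t], j] += 1
    X = X.tocsr()

    print(X.shape, X.nnz)    # only the non-zero entries are stored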
Hey, thanks for the response.
core 2 duo with 4gb RAM.
I've heard about iterative svd functions. I actually need a complete
svd, with all eigenvalues (not LSI). I'm actually more interested in
the individual eigenvectors.
As an example, a single row could probably have about 3000 non-zero elements.
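Some back-of-the-envelope arithmetic for those sizes, assuming float64 data
and 32-bit indices (the exact figures depend on dtype and index width): the
sparse matrix itself is manageable, but the dense factors of a complete SVD
are not.

    n = 75_000
    nnz_per_row = 3_000

    dense_matrix_gb  = n * n * 8 / 1e9                  # ~45 GB for a dense A alone
    dense_factors_gb = 2 * n * n * 8 / 1e9              # dense U and V of a full SVD, ~90 GB more
    sparse_csr_gb    = n * nnz_per_row * (8 + 4) / 1e9  # CSR data + column indices, ~2.7 GB

    print(dense_matrix_gb, dense_factors_gb, sparse_csr_gb)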
On 5/12/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote:
Hi,
I have test data of about 75000 x 75000 dimensions. I need to do svd,
or at least an eigen decomp, on this data. Searching suggests to me
that the linalg functions in scipy and numpy don't work on sparse
matrices.
I can't even get empty((1,1),dtype=float) to work (memory errors).
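If it turns out that only part of the spectrum is really needed, the usual
route at this scale is ARPACK through scipy.sparse.linalg, which works
directly on sparse matrices; a genuinely complete decomposition returns dense
factors as large as the matrix itself, which is the real memory wall whatever
the storage format. A sketch, with placeholder sizes and density:

    from scipy import sparse
    from scipy.sparse.linalg import eigsh, svds

    A = sparse.random(5_000, 5_000, density=0.001, format='csr', random_state=0)

    # A handful of the largest singular triplets, without densifying A.
    u, s, vt = svds(A, k=20)

    # For a symmetric matrix (e.g. a co-occurrence matrix or graph Laplacian),
    # eigsh gives a few extreme eigenpairs the same way.
    S = (A + A.T) * 0.5
    w, v = eigsh(S, k=20, which='LA')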