There's a formula for random projection - something along the lines of "if
your data has error e, a function of (new dimensions, old dimensions, e)".
(Or maybe I found this in one of the +1/-1 random projection papers.)

But anyway, a function of the error of the original data seems more useful
than an absolute error.
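
The bound I'm half-remembering may be the Johnson-Lindenstrauss lemma,
which relates the target dimension to the number of points and the allowed
distortion eps. A minimal sketch of the usual statement (the function name
jl_min_dim is mine, not from any paper):

```python
import math

def jl_min_dim(n_samples, eps):
    # Johnson-Lindenstrauss bound: minimum target dimension k such that
    # pairwise distances among n_samples points survive a random projection
    # to within relative error eps (with high probability).
    return math.ceil(4.0 * math.log(n_samples) / (eps**2 / 2.0 - eps**3 / 3.0))

print(jl_min_dim(1000000, 0.1))  # around twelve thousand dimensions
```

Note the bound depends only on the number of points and eps, not on the
original dimensionality - which is what makes it a relative-error statement.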

On Sun, Nov 6, 2011 at 4:33 PM, Ted Dunning <[email protected]> wrote:

> I think that this gives you a measure of the error due to all of the
> missing eigenvectors/values, not the accuracy of the ones that are present.
>
> Both questions are interesting ones and the answers are clearly somewhat
> related.
>
> On Sun, Nov 6, 2011 at 4:29 PM, Danny Bickson <[email protected]> wrote:
>
> > Regarding accuracy, you can also compute norm(T- V'*A*V)
> > where T is the tridiagonal matrix, V is a matrix with the intermediate
> > solution vectors v_i , and A is the original matrix you want to
> > approximate. (If A is not symmetric then replace it with A*A')
> >
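A small NumPy sketch of Danny's residual check, using a plain Lanczos
iteration with full reorthogonalization (all variable names and the test
matrix are mine; with a healthy iteration the residual is near machine
precision):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 10

# Symmetric test matrix A (for non-symmetric A, use A*A' as noted above).
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

V = np.zeros((n, k))   # intermediate Lanczos vectors v_i, as columns
T = np.zeros((k, k))   # the k-by-k tridiagonal
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
beta, v_prev = 0.0, np.zeros(n)
for j in range(k):
    V[:, j] = v
    w = A @ v - beta * v_prev
    alpha = w @ v
    w -= alpha * v
    w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)  # full reorthogonalization
    T[j, j] = alpha
    beta = np.linalg.norm(w)
    if j + 1 < k:
        T[j, j + 1] = T[j + 1, j] = beta
    v_prev, v = v, w / beta

# Danny's accuracy measure: norm(T - V'*A*V)
print(np.linalg.norm(T - V.T @ A @ V))  # small when the iteration is healthy
```

Without reorthogonalization the residual grows as the Lanczos vectors lose
orthogonality, which is exactly what the check is meant to expose.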
> > On Sun, Nov 6, 2011 at 7:19 PM, Jake Mannix <[email protected]> wrote:
> >
> > > The idea is this: after k iterations, you have a k-by-k tridiagonal
> > > matrix; the eigenvalues of *this* are *an approximation* to the top k
> > > eigenvalues of A*A'.  The largest of these k will have the least error,
> > > the next will have more, and so on down the line.  I've typically
> > > assumed that the bottom third or so of the k are not going to be very
> > > accurate, but it depends on the spectrum of eigenvalues of A*A':
> > > sometimes all but the last few are computed very accurately (but
> > > sometimes it's even worse: if the spectrum is very flat, the mixing
> > > between eigenvectors can be near complete).
> > >
> > > But you don't need a formula: look at EigenVerificationJob for a way to
> > > measure your accuracy.  You can just run it on your data after you've
> > > extracted your first k singular vector/value pairs, and see how far down
> > > this list you want to keep.  The measurement I use is "how close to an
> > > eigenvector is this vector?", i.e. how big is
> > > 1 - (v/|v|) . ((AA'v)/|AA'v|) ?
> > >
> > >  -jake
> > >
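A quick NumPy sketch of Jake's closeness measure (the function name is mine;
a true singular vector of A scores essentially zero, a random vector does
not):

```python
import numpy as np

def eigen_closeness(A, v):
    # Jake's check: 1 - cosine between v and A A' v.
    # Near 0 when v is (close to) an eigenvector of A A'.
    u = v / np.linalg.norm(v)
    w = A @ (A.T @ v)
    w /= np.linalg.norm(w)
    return 1.0 - u @ w

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 20))
U, s, Vt = np.linalg.svd(A)

print(eigen_closeness(A, U[:, 0]))                  # ~0: a true eigenvector of AA'
print(eigen_closeness(A, rng.standard_normal(30)))  # noticeably larger
```

Since AA' is positive semi-definite the cosine is non-negative, so the score
lies in [0, 1].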
> > >
> > > On Sun, Nov 6, 2011 at 3:59 PM, Lance Norskog <[email protected]> wrote:
> > >
> > > > Is there a formula (somewhere easy to find) for "slop" and "accurate"?
> > > >
> > > > On Sun, Nov 6, 2011 at 3:57 PM, Ted Dunning <[email protected]> wrote:
> > > >
> > > > > Almost exactly.  If you stop at k iterations, you don't get the top
> > > > > k eigenvalues precisely.  You need some extra slop to make sure your
> > > > > top eigenvalues are accurate.
> > > > >
> > > > > On Sun, Nov 6, 2011 at 3:46 PM, Danny Bickson <[email protected]> wrote:
> > > > >
> > > > > > Exactly: in each iteration you extract one more eigenvalue, from
> > > > > > largest to smallest.  If you run more iterations than the matrix
> > > > > > rank, then you get all the eigenvalues again; in that case you may
> > > > > > encounter each eigenvalue a few times.
> > > > > >
> > > > > > On Sun, Nov 6, 2011 at 6:42 PM, Sean Owen <[email protected]> wrote:
> > > > > >
> > > > > > > True, though I thought it was intended to run through all m
> > > > > > > steps - is it simply that you stop after some number k, and that
> > > > > > > still leads you to compute the k largest eigenvectors of the
> > > > > > > original?
> > > > > > >  On Nov 6, 2011 11:36 PM, "Danny Bickson" <[email protected]> wrote:
> > > > > > >
> > > > > > > > To be exact, the size of the tridiagonal matrix is the number
> > > > > > > > of iterations + 1.  See the description of the matrix T_mm in
> > > > > > > > Wikipedia: http://en.wikipedia.org/wiki/Lanczos_algorithm
> > > > > > > >
> > > > > > > > Best,
> > > > > > > >
> > > > > > > > DB
> > > > > > > >
> > > > > > > > On Sun, Nov 6, 2011 at 6:30 PM, Sean Owen <[email protected]> wrote:
> > > > > > > >
> > > > > > > > > Oh, do you only need the top k-by-k bit of the tridiagonal
> > > > > > > > > to find the top k eigenvalues?
> > > > > > > > >
> > > > > > > > > I really don't want to write a QR decomposer, phew.
> > > > > > > > >
> > > > > > > > > On Sun, Nov 6, 2011 at 11:26 PM, Ted Dunning <[email protected]> wrote:
> > > > > > > > > > The tridiagonal is much smaller than you would need if you
> > > > > > > > > > wanted all the eigenvalues.  Since you only want a small
> > > > > > > > > > number, you only have a tridiagonal matrix that is some
> > > > > > > > > > multiple of that size.  In-memory decomposition makes
> > > > > > > > > > total sense.
> > > > > > > > > >
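For what it's worth, once the small tridiagonal T is in memory there's no
need to hand-roll a QR decomposer: any symmetric dense eigensolver handles
it directly. A sketch with made-up entries:

```python
import numpy as np

# The k-by-k symmetric tridiagonal T from Lanczos is tiny, so build it
# densely and use a standard symmetric eigensolver (example values only).
d = np.array([4.0, 3.0, 2.0, 1.0])   # diagonal of T
e = np.array([0.5, 0.5, 0.5])        # off-diagonal of T
T = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)

vals = np.linalg.eigvalsh(T)[::-1]   # largest first
print(vals)  # approximations to the top eigenvalues, best first
```

The eigenvalues come out largest-first after the reversal, matching the
earlier point that accuracy degrades as you go down the list.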
> > > > > > > > >
> > > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > > Lance Norskog
> > > > [email protected]
> > > >
> > >
> >
>



-- 
Lance Norskog
[email protected]
