Not particularly easy to understand.  See the Lanczos section in Matrix
Computations by Golub and Van Loan
<http://www.amazon.com/Computations-Hopkins-Studies-Mathematical-Sciences/dp/0801854148>

There is a similar slop required for the random projection stuff.  You
could make a hand-waving argument that the slop is similarly motivated in
both cases, but the random projection probably needs more slop because it
commits to a sub-space on the first step.
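
For what it's worth, here is a minimal sketch of what the slop buys in the
Lanczos case (plain NumPy, no reorthogonalization; none of this is Mahout
code, and the matrix, k, and slop values are made up): run the recurrence for
k steps and again for k + slop steps, then compare the top-k Ritz values
against the true top-k eigenvalues.  The random projection analogue of the
slop is the handful of oversampling columns added to the projection on the
first pass.

import numpy as np

def lanczos(A, m, rng):
    """Run m plain Lanczos steps on symmetric A; return the m x m tridiagonal T."""
    n = A.shape[0]
    alphas, betas = [], []
    q_prev = np.zeros(n)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    beta = 0.0
    for _ in range(m):
        w = A @ q - beta * q_prev          # three-term recurrence
        alpha = q @ w
        w -= alpha * q
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        q_prev, q = q, w / beta
    return np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)

rng = np.random.default_rng(0)
n, k, slop = 500, 10, 15                   # illustrative sizes only
G = rng.standard_normal((n, n))
A = G @ G.T                                # symmetric test matrix

true_topk = np.linalg.eigvalsh(A)[-k:]
for m in (k, k + slop):
    ritz_topk = np.linalg.eigvalsh(lanczos(A, m, rng))[-k:]
    err = np.max(np.abs(ritz_topk - true_topk) / true_topk)
    print(f"m = {m:2d} iterations: worst relative error on the top {k} = {err:.1e}")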

On Sun, Nov 6, 2011 at 3:59 PM, Lance Norskog <[email protected]> wrote:

> Is there a formula (somewhere easy to find) for "slop" and "accurate"?
>
> On Sun, Nov 6, 2011 at 3:57 PM, Ted Dunning <[email protected]> wrote:
>
> > Almost exactly.  If you stop at k iterations, you don't get the top k
> > eigenvalues precisely.  You need some extra slop to make sure your top
> > eigenvalues are accurate.
> >
> > On Sun, Nov 6, 2011 at 3:46 PM, Danny Bickson <[email protected]>
> > wrote:
> >
> > > Exactly, in each iteration you extract one more eigenvalue, from
> > > largest to smallest. If you run more iterations than the matrix rank,
> > > then you get all the eigenvalues again. In that case you may encounter
> > > each eigenvalue a few times.
> > >
> > > On Sun, Nov 6, 2011 at 6:42 PM, Sean Owen <[email protected]> wrote:
> > >
> > > > True though I thought it was intended to run through all m steps - is
> > > > it simply that you stop after some number k and that still leads you
> > > > to compute the k largest eigenvectors of the original?
> > > >  On Nov 6, 2011 11:36 PM, "Danny Bickson" <[email protected]>
> > > > wrote:
> > > >
> > > > > To be exact, the size of the tridiagonal matrix
> > > > > is the number of iterations + 1. See the description of the matrix T_mm
> > > > > in Wikipedia: http://en.wikipedia.org/wiki/Lanczos_algorithm
> > > > >
> > > > > Best,
> > > > >
> > > > > DB
> > > > >
> > > > > On Sun, Nov 6, 2011 at 6:30 PM, Sean Owen <[email protected]>
> > > > > wrote:
> > > > >
> > > > > > Oh do you only need the top k x k bit of the tridiagonal to find
> > > > > > the top k eigenvalues?
> > > > > >
> > > > > > I really don't want to write a QR decomposer, phew.
> > > > > >
> > > > > > On Sun, Nov 6, 2011 at 11:26 PM, Ted Dunning <[email protected]>
> > > > > > wrote:
> > > > > > > The tridiagonal is much smaller than you would need if you
> > > > > > > wanted all the eigenvalues.  Since you only want a small number,
> > > > > > > you only have a tri-diagonal matrix that is some multiple of that
> > > > > > > size.  In-memory decomposition makes total sense.
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
>
>
> --
> Lance Norskog
> [email protected]
>
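
On Sean's worry upthread about having to write a QR decomposer: once the
Lanczos pass has produced the small tridiagonal T, the eigenvalue step is a
routine in-memory call, since T is only m x m with m on the order of k plus
the slop, no matter how large the original matrix is.  A hedged sketch with
made-up stand-in coefficients (in a real run the alphas and betas come out of
the Lanczos recurrence above), using SciPy's tridiagonal eigensolver:

import numpy as np
from scipy.linalg import eigh_tridiagonal

m, k = 25, 10                               # m ~ k + slop; tiny compared to the original matrix
rng = np.random.default_rng(1)
alphas = rng.uniform(1.0, 2.0, size=m)      # stand-in diagonal of T (illustrative only)
betas = rng.uniform(0.1, 0.5, size=m - 1)   # stand-in off-diagonal of T (illustrative only)

# Full decomposition of the m x m tridiagonal -- a trivial in-memory call.
theta, S = eigh_tridiagonal(alphas, betas)
print("largest Ritz values:", theta[-k:])

# In a real run, approximate eigenvectors of the original matrix are then
# Q @ S[:, -k:], where Q holds the n x m Lanczos basis vectors.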
