On Tue, Jan 8, 2013 at 6:41 PM, Sean Owen <sro...@gmail.com> wrote:

> There's definitely a QR decomposition in there for me, since solving
> A = X Y' for X gives X = A Y (Y' Y)^-1, and you need some means to
> compute the inverse of that (small) matrix.
>
>
Sean,
I think I've got it:
1) A Y is a handful of sparse matrix-vector products,
2) Y' Y is a dense matrix-matrix product of a "flat" matrix and a "tall"
matrix, producing a small square matrix,
3) inverting that matrix is not a big deal, since it is small.
Great, thanks! It just wasn't immediately obvious at first glance.
(A quick sketch of the three steps is below.)
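
For anyone else following along, here is a minimal numpy/scipy sketch of
those three steps. The dimensions m, n, k and the random inputs are made
up for illustration; this is not Mahout's actual code:

import numpy as np
from scipy.sparse import random as sparse_random

# Made-up toy dimensions: m users, n items, k latent features.
m, n, k = 1000, 500, 10

A = sparse_random(m, n, density=0.01, format="csr", random_state=42)  # sparse ratings
Y = np.random.rand(n, k)                                              # item features

# 1) A Y: a sparse matrix times a tall dense matrix -> m x k
AY = A @ Y

# 2) Y' Y: a "flat" (k x n) matrix times a "tall" (n x k) matrix
#    -> small k x k square matrix
YtY = Y.T @ Y

# 3) The inverse of the small k x k matrix; in practice it is cheaper
#    and more stable to solve the system than to form inv(Y' Y).
X = np.linalg.solve(YtY, AY.T).T   # X = A Y (Y' Y)^-1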

Now, about the transition from ratings to 1s and 0s: is this simply to
handle implicit feedback, or is it for some other reason?



> On Tue, Jan 8, 2013 at 5:27 PM, Ted Dunning <ted.dunn...@gmail.com> wrote:
> > This particular part of the algorithm can be seen as similar to a least
> > squares problem that might normally be solved by QR.  I don't think that
> > the updates are quite the same, however.
> >
> > On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <s...@apache.org> wrote:
> >
> >> This factorization is iteratively refined. In each iteration, ALS first
> >> fixes the item-feature vectors and solves a least-squares problem for
> >> each user and then fixes the user-feature vectors and solves a
> >> least-squares problem for each item.
> >>
>
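
For the archives: Sebastian's description corresponds to an outer loop
roughly like the sketch below. It is schematic, not the Mahout
implementation: for brevity it solves one shared regularized system per
side instead of a separate least-squares problem per user/item over only
the observed ratings, and lam and the iteration count are arbitrary.

import numpy as np
from scipy.sparse import random as sparse_random

def solve_side(ratings, features, lam=0.1):
    # Fix `features` and solve a regularized least-squares problem for
    # the other side's feature matrix (simplified: one shared k x k
    # system instead of one per user/item over observed cells only).
    k = features.shape[1]
    gram = features.T @ features + lam * np.eye(k)  # small k x k matrix
    rhs = ratings @ features                        # sparse x tall dense
    return np.linalg.solve(gram, rhs.T).T

m, n, k = 1000, 500, 10                             # made-up toy sizes
A = sparse_random(m, n, density=0.01, format="csr", random_state=42)
Y = np.random.rand(n, k)                            # initial item features

for _ in range(10):                                 # arbitrary iteration count
    X = solve_side(A, Y)      # fix item features, solve for user features
    Y = solve_side(A.T, X)    # fix user features, solve for item features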
