Koobas,

ALS doesn't apply QR decomposition to the user-item matrix. It
factorizes this matrix into two smaller matrices that contain the user
and item features.
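
In matrix terms: the sparse m x n rating matrix R is approximated as
R ≈ U M^T, where U is an m x k matrix of user features, M is an n x k
matrix of item features, and k is much smaller than m and n.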

Each user-feature vector is modeled as a linear combination of the
feature vectors of the items which the user rated (and vice versa for
the item-feature vectors).

This factorization is iteratively refined. In each iteration, ALS first
fixes the item-feature vectors and solves a least-squares problem for
each user, and then fixes the user-feature vectors and solves a
least-squares problem for each item.
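
To make the inner step concrete, here is a rough, self-contained Java
sketch of the per-user solve in one half-sweep for explicit feedback,
using the weighted-lambda regularization from [1]. All names are made
up for illustration, and it solves the small k x k system with naive
Gaussian elimination where Mahout uses its QRDecomposition:

// Hypothetical sketch (not Mahout's code): one per-user solve in an ALS
// half-sweep for explicit feedback. Item features are held fixed and we
// solve (Y_u' Y_u + lambda * n_u * I) x_u = Y_u' r_u for one user, where
// Y_u stacks the feature vectors of the n_u items the user rated.
final class AlsSketch {

  static double[] solveUser(int[] itemIds, double[] ratings,
                            double[][] itemFeatures, double lambda) {
    int k = itemFeatures[0].length;
    double[][] a = new double[k][k]; // accumulates Y_u' Y_u
    double[] b = new double[k];      // accumulates Y_u' r_u
    for (int n = 0; n < itemIds.length; n++) {
      double[] y = itemFeatures[itemIds[n]]; // features of a rated item
      for (int i = 0; i < k; i++) {
        b[i] += ratings[n] * y[i];
        for (int j = 0; j < k; j++) {
          a[i][j] += y[i] * y[j];
        }
      }
    }
    for (int i = 0; i < k; i++) {
      a[i][i] += lambda * itemIds.length; // weighted-lambda: scaled by n_u
    }
    return solveDense(a, b); // Mahout uses QR here; we use elimination
  }

  // Naive Gaussian elimination with partial pivoting; fine for tiny k.
  static double[] solveDense(double[][] a, double[] b) {
    int k = b.length;
    for (int col = 0; col < k; col++) {
      int pivot = col;
      for (int row = col + 1; row < k; row++) {
        if (Math.abs(a[row][col]) > Math.abs(a[pivot][col])) pivot = row;
      }
      double[] tr = a[col]; a[col] = a[pivot]; a[pivot] = tr;
      double tb = b[col]; b[col] = b[pivot]; b[pivot] = tb;
      for (int row = col + 1; row < k; row++) {
        double f = a[row][col] / a[col][col];
        for (int j = col; j < k; j++) {
          a[row][j] -= f * a[col][j];
        }
        b[row] -= f * b[col];
      }
    }
    double[] x = new double[k];
    for (int i = k - 1; i >= 0; i--) {
      double s = b[i];
      for (int j = i + 1; j < k; j++) {
        s -= a[i][j] * x[j];
      }
      x[i] = s / a[i][i];
    }
    return x;
  }
}

Note that the solve for one user only touches the n_u items that user
rated: building the normal equations costs O(n_u * k^2) and the dense
k x k solve costs O(k^3), so the sparsity of the rating matrix enters
only through n_u.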

I suggest that you read the papers I referred to in the last mail, which
explain the algorithm in detail.

Best,
Sebastian


On 08.01.2013 23:55, Koobas wrote:
> I am trying to wrap my head around it.
> From the Mahout documentation it looks like it's a standard (dense) QR with
> Householder reflectors.
> But the user-item matrix is usually extremely sparse.
> So, is the sparsity somehow taken into account,
> or are the sparse right-hand-side vectors packed in dense storage and hit
> with Householder?
> The underlying question being the computational complexity, i.e. number of
> floating point operations involved.
> 
> 
> On Tue, Jan 8, 2013 at 4:03 PM, Sebastian Schelter <s...@apache.org> wrote:
> 
>> Hi Koobas,
>>
>> We have two classes that implement the solutions described in the
>> ALS-related papers:
>>
>> For explicit feedback data [1] we have AlternatingLeastSquaresSolver,
>> for implicit feedback data [2] we have
>> ImplicitFeedbackAlternatingLeastSquaresSolver. Both can be found in the
>> org.apache.mahout.math.als package.
>>
>> Internally they use Mahout's QRDecomposition to solve the linear
>> systems associated with ALS.
>>
>> Best,
>> Sebastian
>>
>> [1]
>>
>> http://www.hpl.hp.com/personal/Robert_Schreiber/papers/2008%20AAIM%20Netflix/netflix_aaim08(submitted).pdf
>> [2] http://research.yahoo.com/pub/2433
>>
>>
>> On 08.01.2013 21:53, Koobas wrote:
>>> Perhaps somebody can shed some light on the topic.
>>> What algorithm is used to solve the least squares problem
>>> when computing low-rank approximation using Alternating Least Squares?
>>>
>>
>>
> 
