Case 1 is fine as is.
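
In numpy terms, that would look something like this rough sketch (the
names user_factor, item_factors, and seen_items are made up here,
assuming the factors came out of your ALS/SVD step):

    import numpy as np

    # score every item by the dot product with the user's feature vector
    def recommend(user_factor, item_factors, seen_items, top_n=10):
        scores = item_factors.dot(user_factor)     # (n_items,) predicted preferences
        scores[list(seen_items)] = -np.inf         # drop items the user already has
        return np.argsort(scores)[::-1][:top_n]    # highest scores first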

For Case 2, I would suggest simply experimenting: try different similarity measures, e.g. Euclidean distance or cosine similarity, and see which gives the best results.
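
For example, a quick way to try both measures on the item factors
(again just a sketch; with cosine=True the rows are normalized so the
dot product becomes the cosine similarity):

    import numpy as np

    # find the items most similar to item_id under either measure
    def similar_items(item_id, item_factors, top_n=10, cosine=True):
        V = item_factors
        if cosine:
            # unit-normalize each row, so dot product == cosine
            V = V / np.linalg.norm(V, axis=1, keepdims=True)
        sims = V.dot(V[item_id])       # similarity to every item
        sims[item_id] = -np.inf        # exclude the query item itself
        return np.argsort(sims)[::-1][:top_n]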

--sebastian

On 01/25/2014 04:08 AM, Koobas wrote:
A generic latent variable recommender question.
I passed the user-item matrix through a low-rank approximation,
using something like ALS or SVD, and now I have the feature
vectors for all users and all items.

Case 1:
I want to recommend items to a user.
I compute the dot product of the user’s feature vector with the feature
vectors of all the items.
I eliminate the ones that the user already has, and find the largest value
among the others, right?

Case 2:
I want to find similar items for an item.
Should I compute the dot product of the item’s feature vector with the
feature vectors of all the other items?
    OR
Should I compute the ANGLE between each pair of feature vectors?
I.e., compute the cosine similarity?
I.e., normalize the vectors before computing the dot products?

If “yes” for Case 2, is that something I should also do for Case 1?

