ok, I understand now. But how would you express this loss function mathematically?

also, there is one example where it wouldn't work:

-1 similarity to one user with a single rating of 1, and +1 similarity to another user with a rating of 5. In this case the weighted average is undefined, since the weights in the denominator cancel to zero, but in practice this would be an easy 3.
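One candidate formalization (just a sketch on my part, not necessarily what you have in mind): write s(u,v) for the similarity between the active user u and a neighbor v, r_v for that neighbor's rating of the item, and p for the prediction, and take

    L(p) = \sum_v s(u,v) (p - r_v)^2 .

Positive similarities penalize a prediction far from r_v, negative ones penalize a prediction near it. Setting dL/dp = 0 gives

    p = \frac{\sum_v s(u,v) r_v}{\sum_v s(u,v)} ,

which is exactly the weighted average with negative weights in both numerator and denominator. When \sum_v s(u,v) is zero or negative, the quadratic has no interior minimum, which is where the capping would have to take over.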

Tamas


On 23/02/2010 23:32, Ted Dunning wrote:
On Tue, Feb 23, 2010 at 3:25 PM, Sean Owen <[email protected]> wrote:

Yes I think I understand what you're getting at and the examples. Loss
function here is just the 'penalty' for predicting a rating near to
those of dissimilar users and far from those of similar users?

Yes.  Exactly.


If I read correctly, you think that a 'weighted average' (with
negative weights in numerator and denominator...) plus
capping is an intellectually sound way of handling this situation.

Exactly.  And I think that the examples demonstrate reasonable behavior in a
variety of regimes.
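Concretely, here is a minimal sketch of the kind of capped weighted average being discussed, assuming raw (signed) similarities in both numerator and denominator; the class and method names and the NaN fallback are my own assumptions, not Mahout's actual code:

    // Sketch only: a capped weighted average over neighbor ratings,
    // with possibly-negative similarities as weights.
    public final class CappedWeightedAverage {

      static double predict(double[] sims, double[] ratings,
                            double minRating, double maxRating) {
        double num = 0.0;
        double den = 0.0;
        for (int i = 0; i < sims.length; i++) {
          num += sims[i] * ratings[i];
          den += sims[i];
        }
        if (den == 0.0) {
          return Double.NaN;   // weights cancel exactly: estimate undefined
        }
        double estimate = num / den;
        // "Capping": clamp the estimate to the valid rating range.
        return Math.max(minRating, Math.min(maxRating, estimate));
      }

      public static void main(String[] args) {
        // The example above: -1 similarity to a user who rated 1,
        // +1 similarity to a user who rated 5 -> the denominator is 0.
        System.out.println(predict(new double[] {-1, 1},
                                   new double[] {1, 5}, 1, 5)); // NaN
      }
    }

Called with similarities {-1, +1} and ratings {1, 5}, the denominator is exactly zero, which is the degenerate case in the example above.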

