I've just finished implementing a collaborative recommender system based on 
views, sales and similar user-product interactions. This approach works quite 
well and I am pleased with the results. 

Nevertheless, I want to know what influence integrating some item-specific 
features would have. Approaches like computing the similarity of user and 
item vectors within the same feature space (e.g. term vectors of product 
descriptions) are easy to scale with MapReduce.
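To make the idea concrete, here is a minimal sketch of what I mean by that first approach, with made-up item names and term vectors: the user profile is just the sum of the term vectors of the items the user interacted with, and candidate items are ranked by cosine similarity against it. Both steps are per-record operations, which is why this parallelizes so naturally.

```python
import math
from collections import Counter

def cosine(u, v):
    """Cosine similarity between two sparse term vectors (dicts)."""
    dot = sum(u[t] * v.get(t, 0) for t in u)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    if norm_u == 0 or norm_v == 0:
        return 0.0
    return dot / (norm_u * norm_v)

# Hypothetical term vectors extracted from product descriptions.
items = {
    "item_a": Counter({"camera": 2, "lens": 1}),
    "item_b": Counter({"tripod": 1, "lens": 2}),
}

# User profile: sum of the term vectors of items this user viewed/bought.
user_profile = Counter()
for item_id in ["item_a"]:
    user_profile.update(items[item_id])

# Rank all candidate items by similarity to the user profile.
scores = {i: cosine(user_profile, v) for i, v in items.items()}
```

Since each (user profile, item vector) pair is scored independently, the scoring loop maps directly onto a MapReduce job keyed by item.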

I wonder how model-based approaches might be scaled to a large number of users. 
My understanding is that I would have to train a model like a decision tree 
or naive Bayes (or regression, etc.) for each user and then do the prediction 
for each item using that model.
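To illustrate the per-user setup I have in mind, here is a rough sketch (all item names and features are hypothetical): one Bernoulli naive Bayes model per user, trained on binary item features with liked/ignored labels, then used to score new items. Since each user's model is independent, the training could in principle be one map task per user.

```python
import math
from collections import defaultdict

# Hypothetical binary item features, e.g. terms from product descriptions.
VOCAB = ["camera", "lens", "tripod", "bag"]

def train_user_model(pos_items, neg_items):
    """Bernoulli naive Bayes for one user, with Laplace smoothing."""
    model = {}
    for label, feats_list in (("pos", pos_items), ("neg", neg_items)):
        counts = defaultdict(int)
        for feats in feats_list:
            for f in feats:
                counts[f] += 1
        n = len(feats_list)
        # P(feature present | label), smoothed.
        model[label] = {f: (counts[f] + 1) / (n + 2) for f in VOCAB}
        model[label + "_n"] = n
    total = model["pos_n"] + model["neg_n"]
    model["pos_prior"] = model["pos_n"] / total
    model["neg_prior"] = model["neg_n"] / total
    return model

def score(model, feats):
    """Log-odds that this user likes an item with the given feature set."""
    s = math.log(model["pos_prior"]) - math.log(model["neg_prior"])
    for f in VOCAB:
        p, q = model["pos"][f], model["neg"][f]
        if f in feats:
            s += math.log(p) - math.log(q)
        else:
            s += math.log(1 - p) - math.log(1 - q)
    return s

# One independent model per user -> trivially parallel across users.
m = train_user_model(
    pos_items=[{"camera", "lens"}, {"lens"}],  # items the user liked
    neg_items=[{"tripod", "bag"}],             # items the user ignored
)
```

The open question for me is whether training and applying millions of such small models is the usual way to do this, or whether there is a better formulation.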

Is there any common approach to scaling those techniques to larger 
datasets?


Many thanks in advance,
Dominik
