Re: Tuning parameters for ALS-WR

2013-09-11 Thread Sean Owen
On Wed, Sep 11, 2013 at 12:22 AM, Parimi Rohit rohit.par...@gmail.com wrote: 1. Do we have to follow this setting to compare algorithms? Can't we report the parameter combination for which we get the highest mean average precision for the test data, when trained on the train set, without any…

Re: Tuning parameters for ALS-WR

2013-09-11 Thread Ted Dunning
On Wed, Sep 11, 2013 at 12:07 AM, Sean Owen sro...@gmail.com wrote: 2. Do we have to tune the similarityclass parameter in item-based CF? If so, do we compare the mean average precision values based on validation data, and then report the same for the test set? Yes, you are…
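Mean average precision, the metric discussed in this thread, can be sketched as follows. This is a minimal illustration using one common AP convention (precision averaged over the ranks of relevant hits, normalized by `min(|relevant|, |recommended|)`), not Mahout's evaluator API:

```python
def average_precision(recommended, relevant):
    """AP for one user: average of precision@k taken at each rank k
    where the k-th recommended item is relevant."""
    hits = 0
    score = 0.0
    for k, item in enumerate(recommended, start=1):
        if item in relevant:
            hits += 1
            score += hits / k
    if not relevant:
        return 0.0
    return score / min(len(relevant), len(recommended))

def mean_average_precision(rec_lists, rel_sets):
    """MAP: mean of per-user average precision."""
    return sum(average_precision(r, s)
               for r, s in zip(rec_lists, rel_sets)) / len(rec_lists)
```

Computed on the validation set, this score drives parameter choice; the same function applied once to the held-out test set gives the number to report.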

Tuning parameters for ALS-WR

2013-09-10 Thread Parimi Rohit
Hi All, I was wondering if there is an experimental design to tune the parameters of the ALS algorithm in Mahout, so that we can compare its recommendations with recommendations from another algorithm. My datasets have implicit data, and I would like to use the following design for tuning the ALS…
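The tuning design being asked about is, in essence, a grid search: train one ALS model per parameter combination and keep the setting that scores best on validation data. A minimal sketch, where `train_als` and `evaluate` are hypothetical placeholders for your trainer and your MAP evaluator (not Mahout functions):

```python
from itertools import product

def grid_search_als(train, val, ranks, lambdas, alphas, train_als, evaluate):
    """Try every (rank, lambda, alpha) combination, score each trained
    model on the validation set, and return (best_params, best_score)."""
    best = None
    for rank, lam, alpha in product(ranks, lambdas, alphas):
        model = train_als(train, rank=rank, lam=lam, alpha=alpha)
        score = evaluate(model, val)  # e.g. mean average precision
        if best is None or score > best[1]:
            best = ((rank, lam, alpha), score)
    return best
```

Only after this search is finished should the chosen model be scored once on the untouched test set.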

Re: Tuning parameters for ALS-WR

2013-09-10 Thread Ted Dunning
You definitely need to separate into three sets. Another way to put it is that with cross validation, any learning algorithm needs to have test data withheld from it. The remaining data is training data to be used by the learning algorithm. Some training algorithms, such as the one that you…
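The three-set separation described above can be sketched as a simple random partition of the interaction data (an illustrative helper, not part of Mahout; fractions and function name are assumptions):

```python
import random

def three_way_split(interactions, val_frac=0.1, test_frac=0.1, seed=42):
    """Shuffle interactions and withhold test and validation sets.

    The test set is never touched during training or tuning; the
    validation set guides parameter choices; the remainder is the
    training data fed to the learning algorithm.
    """
    rng = random.Random(seed)
    shuffled = list(interactions)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    test = shuffled[:n_test]
    val = shuffled[n_test:n_test + n_val]
    train = shuffled[n_test + n_val:]
    return train, val, test
```

For implicit-feedback recommenders, splitting is often done per user rather than globally so every user keeps some training history; the sketch above shows only the simplest global variant.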