My completely unfounded guess would be that the sign is built into gradientBase.
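
If gradientBase is computed as (target - predicted), which I believe is what Mahout's DefaultGradient returns, then it is the gradient of the log-likelihood rather than of the loss, and adding it descends the negative log-likelihood. The minus sign you're looking for would already be folded into gradientBase itself.

Here's a minimal standalone sketch of that idea (plain Java, not Mahout's actual classes; the class name SgdSignSketch and the toy data are made up for illustration, and it assumes gradientBase = (y - p) for binary logistic regression):

    import java.util.Arrays;

    // Hypothetical sketch: binary logistic regression SGD where the update
    // uses "+=" because gradientBase already carries the sign.
    public class SgdSignSketch {
        public static void main(String[] args) {
            double[] beta = {0.0, 0.0};
            double[] x = {1.0, 2.0};   // one training instance
            double y = 1.0;            // its label
            double learningRate = 0.1;

            for (int step = 0; step < 100; step++) {
                // p = sigmoid(beta . x), the model's predicted probability
                double dot = 0.0;
                for (int j = 0; j < beta.length; j++) {
                    dot += beta[j] * x[j];
                }
                double p = 1.0 / (1.0 + Math.exp(-dot));

                // The sign lives here: (y - p) is minus the gradient of the
                // loss (p - y), so no explicit minus appears in the update.
                double gradientBase = y - p;

                for (int j = 0; j < beta.length; j++) {
                    beta[j] += learningRate * gradientBase * x[j];
                }
            }
            System.out.println("beta = " + Arrays.toString(beta));
        }
    }

If you run this, p moves toward y = 1 and the negative log-likelihood falls step over step, so the plus sign really is doing descent on the loss.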

On Nov 28, 2012, at 2:19 PM, David Kincaid wrote:

> While trying to wrap my head around the Mahout code for SGD I noticed that
> the update to the beta terms seems to be doing gradient ascent and not
> descent. Could someone help me find the missing minus sign?
> 
> The line of code in question from AbstractOnlineLogisticRegression.java,
> train() is:
> 
>        double newValue = beta.getQuick(i, j)
>            + gradientBase * learningRate * perTermLearningRate(j) * instance.get(j);
> 
> It looks to me like the update to beta is ascending the gradient (hence the
> plus sign instead of a minus). Could you help me understand where my
> thinking is going wrong?
> 
> Thanks,
> 
> Dave
