gradientBase comes from:
double gradientBase = gradient.get(i);

Prior to that: 
Vector gradient = this.gradient.apply(groupKey, actual, instance, this);

"this.gradient" is an instance of DefaultGradient (in the same project).  The 
last two lines of the apply function are:
r.assign(v, Functions.MINUS);
return r;

This appears to be where the gradient values are negated.
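
To make that concrete, here is a minimal sketch of what the assign call does.
It assumes (from my reading of DefaultGradient.apply) that r holds the target
encoding and v holds the model's current prediction; the values below are made
up, and it only needs mahout-math on the classpath:

    import org.apache.mahout.math.DenseVector;
    import org.apache.mahout.math.Vector;
    import org.apache.mahout.math.function.Functions;

    public class AssignMinusDemo {
        public static void main(String[] args) {
            // Stand-ins for the vectors in DefaultGradient.apply():
            // r = target encoding, v = current prediction (made-up values).
            Vector r = new DenseVector(new double[] {1.0, 0.0});
            Vector v = new DenseVector(new double[] {0.7, 0.3});

            // assign(v, Functions.MINUS) overwrites r element by element
            // with r[i] - v[i], i.e. (target - prediction).
            r.assign(v, Functions.MINUS);

            System.out.println(r.get(0)); // ~0.3
            System.out.println(r.get(1)); // -0.3
        }
    }

So the vector that train() reads gradientBase from holds (target - prediction),
not (prediction - target). Adding it to beta moves the prediction toward the
target: the sign of the descent step is already baked into gradientBase, which
is why the update in train() can use "+".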

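As for the v.minus(r) question in the message below: as far as I can tell from
the Mahout Vector API, v.minus(r) returns a new vector equal to v - r (the
receiver minus the argument) and leaves v untouched, whereas
r.assign(v, Functions.MINUS) modifies r in place to r - v. A tiny illustration,
reusing the imports from the sketch above (made-up values):

    Vector v = new DenseVector(new double[] {2.0, 5.0});
    Vector r = new DenseVector(new double[] {1.0, 1.0});
    Vector d = v.minus(r);          // new vector v - r  -> {1.0, 4.0}
    r.assign(v, Functions.MINUS);   // r becomes r - v   -> {-1.0, -4.0}
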
-----Original Message-----
From: David Kincaid [mailto:[email protected]] 
Sent: Wednesday, November 28, 2012 1:41 PM
To: [email protected]
Subject: Re: Mahout SGD - is it really descent?

I thought it might be too, but it doesn't look like it to me. Of course, I
really have a hard time following vector and matrix math done in Java. Does
v.minus(r) mean v - r or r - v?

On Wed, Nov 28, 2012 at 1:28 PM, David Arthur <[email protected]> wrote:

> My completely unfounded guess would be the sign is built into gradientBase
>
> On Nov 28, 2012, at 2:19 PM, David Kincaid wrote:
>
> > While trying to wrap my head around the Mahout code for SGD I noticed
> > that the update to the beta terms seems to be doing gradient ascent and
> > not descent. Could someone help me find the missing minus sign?
> >
> > The line of code in question from AbstractOnlineLogisticRegression.java,
> > train() is:
> >
> >        double newValue = beta.getQuick(i, j) + gradientBase * learningRate
> >            * perTermLearningRate(j) * instance.get(j);
> >
> > It looks to me like the update to beta is ascending the gradient (hence
> > the addition sign instead of minus). Could you help me understand where
> > my thinking is going wrong?
> >
> > Thanks,
> >
> > Dave
>
>
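
P.S. Writing the quoted update out (my own shorthand: t_i for the target
indicator of category i, p_i for the predicted probability, x_j for feature j),
with gradientBase = t_i - p_i it becomes

    beta(i, j) <- beta(i, j) + learningRate * perTermLearningRate(j) * (t_i - p_i) * x_j

which, as far as I can tell, is the standard SGD step that increases the
log-likelihood (equivalently, decreases the log-loss), so it is descent on the
loss despite the plus sign.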
