Ah, of course. Great explanation. So I suppose the training RMSE should be
non-increasing as expected with lambda = 0, although you generally don't
want to set it to 0.
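
For anyone following along, here is a minimal sketch of the objective
Xiangrui describes (plain NumPy, with made-up factor matrices U and V, a
tiny ratings list, and a hypothetical lam parameter; it ignores MLlib's
weighted-lambda scaling). The point is that the solver drives down the
squared loss plus the L2 penalty, not the squared loss alone; with lam = 0
the two coincide, so only then is the training RMSE itself guaranteed to be
non-increasing:

    import numpy as np

    # Hypothetical rank-2 factors for a tiny explicit-feedback problem.
    # U[i] is a user factor, V[j] an item factor, ratings is (user, item, rating).
    U = np.array([[0.5, 1.0], [1.2, 0.3]])
    V = np.array([[0.8, 0.4], [0.1, 0.9]])
    ratings = [(0, 0, 4.0), (0, 1, 1.0), (1, 0, 3.0)]
    lam = 0.1  # regularization parameter (lambda)

    # Squared loss: the part that RMSE is computed from.
    squared_loss = sum((r - U[i].dot(V[j])) ** 2 for i, j, r in ratings)

    # Global objective being minimized: squared loss + L2 penalty.
    penalty = lam * (np.sum(U ** 2) + np.sum(V ** 2))
    objective = squared_loss + penalty

    # With lam = 0 the objective reduces to the squared loss, so the
    # training RMSE should then be non-increasing across iterations.
    print(squared_loss, objective)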

On Wed, Nov 26, 2014 at 7:53 PM, Xiangrui Meng <[email protected]> wrote:
> The training RMSE may increase due to regularization. The squared loss
> only represents part of the global loss. If you track the sum of the
> squared loss and the regularization term, it should be non-increasing.
