I am a bit reluctant to support normalize: to be consistent, it would have to
be done everywhere, including for example in the SVM models. Also,
normalization to unit standard deviation is not always what you want: if you
have an outlier coefficient, it will collapse the good values to 0. Hence you might
On Wed, Oct 31, 2012 at 06:10:13PM +0100, Alexandre Gramfort wrote:
> fine with me but do you push the logic further to any linear estimator ?
> For example in Ridge we also have normalize=False by default.
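A minimal numpy sketch of the collapse described above (an illustration added for clarity, not code from the thread): one outlier inflates the standard deviation so much that dividing by it squashes the regular values onto nearly the same point.

```python
import numpy as np

# A feature whose values sit around 1.0, plus one large outlier.
x = np.array([0.9, 1.0, 1.1, 1000.0])

# Scale to zero mean and unit standard deviation. The outlier dominates
# the std, so after scaling the three regular values become nearly
# indistinguishable (all close to -0.58 here).
x_scaled = (x - x.mean()) / x.std()
print(x_scaled)
```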
Should normalize be on by default in Ridge, or should it not, based on your experience of
2012/10/31 Alexandre Gramfort :
> fine with me but do you push the logic further to any linear estimator ?
> For example in Ridge we also have normalize=False by default.
>
> I would say that LassoLars is more the exception than the norm.
Indeed, yet another tricky mission for the Consistency Brigade
fine with me but do you push the logic further to any linear estimator ?
For example in Ridge we also have normalize=False by default.
I would say that LassoLars is more the exception than the norm.
Alex
On Wed, Oct 31, 2012 at 11:53 AM, Jaques Grobler wrote:
> It makes sense to me to make the
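For context, roughly what the normalize option amounted to in these linear models, as I understand it: center each column of the design matrix, then scale it to unit norm before fitting. A numpy sketch (the helper name is mine, added for illustration):

```python
import numpy as np

def normalize_design(X):
    """Center each column, then scale it to unit l2 norm -- roughly the
    preprocessing that normalize=True applied in the linear models."""
    X = np.asarray(X, dtype=float)
    X_centered = X - X.mean(axis=0)
    norms = np.linalg.norm(X_centered, axis=0)
    norms[norms == 0] = 1.0  # leave constant (all-zero after centering) columns as-is
    return X_centered / norms

X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
Xn = normalize_design(X)
print(np.linalg.norm(Xn, axis=0))  # each column now has unit norm
```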
It makes sense to me to make the change; however, scikit-learn users would
need to be warned about this. Perhaps for now we can just add a warning that
the API will be changing, so as to make users well aware (before actually
changing the API), and that they must set it manually in the meanwhile.
2012/10/31 Gael Varoquaux :
>
> I want to change this (warning backward compatibility breakage :$ ). I
> want to change Lasso to have normalize=True, because in my experience
> this is a sane behavior. This would imply, for consistency, changing
> ElasticNet to also have normalize=True. We would ha