Hi,
thanks for your suggestions. I will try both options.
Best,
David
On Tue, Aug 11, 2020 at 5:39 PM Mainak Jas wrote:
Hi David,
Michael has great ideas and they might serve your purpose. If not and if
you are willing to try another software package that is compatible with the
scikit-learn ecosystem, you can look into pyglmnet:
http://glm-tools.github.io/pyglmnet/auto_examples/plot_tikhonov.html
Hi David,
I am assuming you mean that T acts on w.
If T is invertible, you can absorb it into the design matrix by making a
change of variables v = Tw, w = T^-1 v, and solving a standard ridge
regression for v. If it is not (e.g. when T is a standard finite-difference
derivative operator), then this trick won't apply directly.
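The change-of-variables trick above can be checked numerically. A minimal numpy sketch (the data, penalty strength alpha, and the particular invertible T below are made up for illustration), comparing the transformed ridge solution against the direct generalized-Tikhonov closed form:

```python
# Sketch of the change-of-variables trick, assuming T is invertible.
# Minimizing ||y - Xw||^2 + alpha*||Tw||^2 with v = Tw becomes ordinary
# ridge on the transformed design X @ inv(T):
#   ||y - (X T^-1) v||^2 + alpha*||v||^2,   then w = T^-1 v.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)
alpha = 0.7

# An invertible T (upper bidiagonal with unit diagonal, so det(T) = 1).
T = np.eye(p) + np.diag(0.5 * np.ones(p - 1), k=1)
T_inv = np.linalg.inv(T)

# Ridge on the transformed design, via its closed-form normal equations.
Xt = X @ T_inv
v = np.linalg.solve(Xt.T @ Xt + alpha * np.eye(p), Xt.T @ y)
w_trick = T_inv @ v

# Direct generalized-Tikhonov solution for comparison:
# w = (X^T X + alpha T^T T)^-1 X^T y
w_direct = np.linalg.solve(X.T @ X + alpha * T.T @ T, X.T @ y)

assert np.allclose(w_trick, w_direct)
```

The two solutions agree because T^T (T^-T X^T X T^-1 + alpha I) T = X^T X + alpha T^T T.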
Hi,
I was looking at the docs for Ridge regression, and they state that it
minimizes
||y - Xw||^2 + alpha*||w||^2
I would like to minimize the function
||y - Xw||^2 + ||Tx||^2, where T is a matrix, in order to impose certain
properties on the solution vectors, but I haven't found any way to achieve
that.
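Assuming T is meant to act on the coefficient vector w, one standard way to handle this objective for any T, invertible or not, is to rewrite it as an ordinary least-squares problem on an augmented system: minimize ||[y; 0] - [X; T] w||^2. A minimal numpy sketch (the data and the first-difference T below are made up for illustration):

```python
# Hedged sketch: ||y - Xw||^2 + ||Tw||^2 as augmented least squares,
# which works even when T is singular (e.g. a finite-difference operator).
import numpy as np

rng = np.random.default_rng(1)
n, p = 40, 6
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

# First-difference operator, shape (p-1, p); its null space contains
# constant vectors, so it is not invertible.
T = np.diff(np.eye(p), axis=0)

# Stack the penalty under the design matrix and solve by least squares.
X_aug = np.vstack([X, T])
y_aug = np.concatenate([y, np.zeros(T.shape[0])])
w, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

# Sanity check against the normal equations (X^T X + T^T T) w = X^T y.
assert np.allclose((X.T @ X + T.T @ T) @ w, X.T @ y)
```

A separate penalty weight alpha on ||Tw||^2 can be folded in by stacking sqrt(alpha) * T instead of T.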