From a generalization point of view (test accuracy), the optimal
sparsity support should not matter much, but it can still be helpful
to find the sparsest optimal solution, either for computational
constraints (smaller models with lower prediction latency) or for
interpretation of the weights.
Note that SGD is not very good at optimizing finely with a non-smooth
penalty (e.g. L1 or elastic net). The upcoming SAGA solver is going to
be much better at finding the optimal sparsity support (although this
support is not guaranteed to be stable across re-samplings of the
training set).
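(For later readers: the SAGA solver mentioned above has since landed, and
from scikit-learn 0.21 onward LogisticRegression supports elastic net
directly via solver='saga'. A minimal sketch; the dataset and the
hyperparameters alpha/C/l1_ratio below are illustrative choices, not
recommendations:)

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy binary classification problem with mostly uninformative features.
X, y = make_classification(n_samples=500, n_features=50,
                           n_informative=5, random_state=0)

# Elastic-net penalised logistic regression via the saga solver
# (scikit-learn >= 0.21); l1_ratio mixes the L1 and L2 terms.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=0.1, max_iter=5000)
clf.fit(X, y)

# saga can drive coefficients exactly to zero, exposing the
# sparsity support discussed above.
support = np.flatnonzero(clf.coef_[0])
print(f"non-zero coefficients: {support.size} / {clf.coef_.size}")
```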
Many thanks.
On Mon, Mar 13, 2017 at 10:08 AM, Sebastian Raschka
wrote:
> Hi, Stuart,
> I think the only way to do that right now would be through the SGD
> classifier, e.g.,
>
> sklearn.linear_model.SGDClassifier(loss='log', penalty='elasticnet' …)
>
> Best,
> Sebastian
>
> > On Mar 13, 2017, a
Hi, Stuart,
I think the only way to do that right now would be through the SGD classifier,
e.g.,
sklearn.linear_model.SGDClassifier(loss='log', penalty='elasticnet' …)
Best,
Sebastian
> On Mar 13, 2017, at 12:57 PM, Stuart Reynolds
> wrote:
>
> Is there an implementation of logistic regress
Perfect. Thanks -- will give it a go.
On Mon, Mar 13, 2017 at 10:04 AM, Jacob Schreiber
wrote:
> Hi Stuart
>
> Take a look at this issue: https://github.com/scikit-learn/scikit-learn/
> issues/2968
>
> On Mon, Mar 13, 2017 at 9:57 AM, Stuart Reynolds <
> [email protected]> wrote:
>
>> Is
There are also some recent issues/PRs tackling this topic:
https://github.com/scikit-learn/scikit-learn/issues/8288
https://github.com/scikit-learn/scikit-learn/issues/8446
___
scikit-learn mailing list
[email protected]
https://mail.python.org/mailm
Hi Stuart
Take a look at this issue:
https://github.com/scikit-learn/scikit-learn/issues/2968
On Mon, Mar 13, 2017 at 9:57 AM, Stuart Reynolds
wrote:
> Is there an implementation of logistic regression with elastic net
> regularization in scikit?
> (or pointers on implementing this - it seems