I literally owe my career in the data space to scikit-learn. It’s not just
a framework but a school of thought regarding predictive modeling.
Super well deserved, folks :)
Schots
On Sat, Feb 5, 2022 at 13:32, Gyro Funch
wrote:
> On 2022-02-05 04:23 PM, Gael Varoquaux wrote:
> > Hi ev
There is no lower bound; that's the point. Nothing stops a model from making
predictions bad enough to be worse than just predicting the
target's mean for every data point.
If you do so -> negative R².
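A minimal sketch of the point above: the constant mean predictor scores exactly R² = 0, and any model that does worse than it goes negative (data here is made up for illustration).

```python
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([1.0, 2.0, 3.0, 4.0])

# Predicting the target's mean for every data point -> R^2 == 0.
mean_pred = np.full_like(y_true, y_true.mean())
print(r2_score(y_true, mean_pred))  # 0.0

# Predictions worse than the mean baseline -> negative R^2.
bad_pred = np.array([4.0, 3.0, 2.0, 1.0])
print(r2_score(y_true, bad_pred))  # -3.0
```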
Best Regards,
On Thu, Aug 12, 2021 at 16:21, Samir K Mahajan <
samirkmahaja
I have been using both in time-series classification. I put an exponential
decay in sample_weight AND class weights as a dictionary.
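A rough sketch of that setup (the data, the half-life, and the class-weight values are all made up for illustration): exponentially decaying sample weights so recent observations count more, combined with a `class_weight` dictionary on the estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))
y = rng.integers(0, 2, size=n)  # samples ordered oldest -> newest

# Exponential decay: the oldest sample gets the smallest weight.
half_life = 50  # assumed; tune per problem
decay = 0.5 ** (np.arange(n)[::-1] / half_life)

# class_weight multiplies on top of sample_weight inside fit().
clf = LogisticRegression(class_weight={0: 1.0, 1: 2.0})
clf.fit(X, y, sample_weight=decay)
```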
BR/Schots
On Fri, Dec 4, 2020 at 12:01, Nicolas Hug
wrote:
> Basically passing class weights should be equivalent to passing
> per-class-constant sample
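The equivalence in the quote above can be checked directly (synthetic data, arbitrary weight of 3.0 on class 1): fitting with `class_weight={0: 1, 1: 3}` should give the same coefficients as fitting with a per-class-constant `sample_weight`.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + rng.normal(size=100) > 0).astype(int)

# Same objective two ways: a class_weight dict, or the equivalent
# per-class-constant sample_weight vector.
a = LogisticRegression(class_weight={0: 1.0, 1: 3.0}).fit(X, y)
b = LogisticRegression().fit(X, y, sample_weight=np.where(y == 1, 3.0, 1.0))

print(np.allclose(a.coef_, b.coef_))  # True
```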
You should instantiate LogisticRegression() before fitting.
logreg = LogisticRegression().fit(Xnp, ynp)
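Made self-contained for reference (`Xnp`/`ynp` stand in for the poster's NumPy arrays; synthetic data here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
Xnp = rng.normal(size=(50, 2))
ynp = (Xnp[:, 0] > 0).astype(int)

# Instantiate the estimator, then fit; fit() returns the fitted instance.
logreg = LogisticRegression().fit(Xnp, ynp)
print(logreg.predict(Xnp[:5]))
```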
[]’s
Maykon Schots
On Sun, Nov 1, 2020 at 23:41, The Helmbolds via scikit-learn <
scikit-learn@python.org> wrote:
> What parentheses?
> Enclosing what?
>
> "You won't find the right