E.g., if you have a feature with values 'a', 'b', 'c', then applying the one-hot
encoder will transform this into 3 features.
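For instance, a minimal sketch with scikit-learn's OneHotEncoder (the toy array
below is made up just for illustration):

from sklearn.preprocessing import OneHotEncoder
import numpy as np

# one column with three distinct values -> three output columns
X = np.array([['a'], ['b'], ['c'], ['a']])
enc = OneHotEncoder(sparse=False)
print(enc.fit_transform(X))
# [[1. 0. 0.]
#  [0. 1. 0.]
#  [0. 0. 1.]
#  [1. 0. 0.]]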
Best,
Sebastian
> On Jan 7, 2019, at 11:02 PM, pisymbol wrote:
Maybe check
a) if the actual labels of the training examples don't start at 0
b) if you have gaps, e.g., if your unique training labels are 0, 1, 4, ..., 23 (a quick check is sketched below)
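Something along these lines (just a sketch with made-up data; coef_ has shape
(1, n_features) only when the estimator sees exactly two classes, otherwise it
becomes (n_classes, n_features)):

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.randn(100, 5)

y_binary = rng.randint(0, 2, 100)             # two unique labels -> binary
clf = LogisticRegression(solver='lbfgs').fit(X, y_binary)
print(clf.classes_, clf.coef_.shape)          # [0 1] (1, 5)

y_multi = rng.choice([0, 1, 4, 23], 100)      # four unique labels -> multiclass
clf = LogisticRegression(solver='lbfgs', multi_class='auto').fit(X, y_multi)
print(clf.classes_, clf.coef_.shape)          # [ 0  1  4 23] (4, 5)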
Best,
Sebastian
> On Jan 7, 2019, at 10:50 PM, pisymbol wrote:
On Mon, Jan 7, 2019 at 11:50 PM pisymbol wrote:
According to the doc (0.20.2), the coef_ variable is supposed to be of shape
(1, n_features) for binary classification. Well, I created a Pipeline and
performed a GridSearchCV to create a LogisticRegression model that does
fairly well. However, when I want to rank feature importance, I noticed that
my coef_
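For context, pulling the coefficients out of a pipeline fitted via GridSearchCV
would look roughly like this (the step names, parameter grid, and toy data are
only placeholders, not the actual setup from the question):

from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

pipe = Pipeline([('scale', StandardScaler()),
                 ('clf', LogisticRegression(solver='lbfgs'))])
grid = GridSearchCV(pipe, {'clf__C': [0.1, 1.0, 10.0]}, cv=5)
grid.fit(X, y)

# best_estimator_ is the refitted pipeline; grab the final step's coefficients
coefs = grid.best_estimator_.named_steps['clf'].coef_
print(coefs.shape)   # (1, 10) for a binary target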
Hi everybody and happy new year,
We let this thread about the sprint die. I hope that this didn't change
people's plans.
So, it seems that the week of Feb 25th is a good week. I'll assume that
it's good for most and start planning from there (if it's not the case,
let me know).
I've started our