Re: [scikit-learn] LogisticRegression

2019-06-11 Thread Eric J. Van der Velden
Thanks! On Tue, Jun 11, 2019, 20:48 Andreas Mueller wrote: > On 6/11/19 11:47 AM, Eric J. Van der Velden wrote: > > Hi Nicolas, Andrew, > > Thanks! > > I found out that it is the regularization term. Sklearn always has that > term. When I program logistic regression with that term too, with

Re: [scikit-learn] LogisticRegression

2019-06-11 Thread Andreas Mueller
On 6/11/19 11:47 AM, Eric J. Van der Velden wrote: Hi Nicolas, Andrew, Thanks! I found out that it is the regularization term. Sklearn always has that term. When I program logistic regression with that term too, with \lambda=1, I get exactly the same answer as sklearn, when I look at the parameters you gave me.

Re: [scikit-learn] LogisticRegression

2019-06-11 Thread Eric J. Van der Velden
Hi Nicolas, Andrew, Thanks! I found out that it is the regularization term. Sklearn always has that term. When I program logistic regression with that term too, with \lambda=1, I get exactly the same answer as sklearn, when I look at the parameters you gave me. The question is why sklearn always has that term.
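
[Editorial note: a minimal sketch, not from the original mails and with toy data invented here, of how the two formulations line up. scikit-learn's LogisticRegression applies an L2 penalty by default, and its C parameter is the inverse of the usual \lambda, so C=1.0 corresponds to \lambda=1, while a very large C approximates the unregularized cost.]

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] + 0.5 * X[:, 1] + 0.5 * rng.randn(200) > 0).astype(int)

# default behaviour: L2 penalty with C = 1.0, i.e. \lambda = 1
clf_l2 = LogisticRegression(C=1.0, solver='lbfgs').fit(X, y)

# a huge C makes the penalty negligible, approximating an unregularized fit
clf_free = LogisticRegression(C=1e10, solver='lbfgs', max_iter=1000).fit(X, y)

print(clf_l2.coef_, clf_l2.intercept_)
print(clf_free.coef_, clf_free.intercept_)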

Re: [scikit-learn] LogisticRegression

2019-06-11 Thread Andrew Howe
The coef_ attribute of the LogisticRegression object stores the parameters. Andrew
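
[Editorial note: a minimal sketch, with toy data made up here, of reading those fitted parameters.]

import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])

clf = LogisticRegression(solver='lbfgs').fit(X, y)
print(clf.coef_)       # learned weights, shape (1, n_features) for a binary problem
print(clf.intercept_)  # learned bias term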

Re: [scikit-learn] LogisticRegression

2019-06-08 Thread Eric J. Van der Velden
Here I have added what I had programmed. With sklearn's LogisticRegression(), how can I see the parameters it has found after .fit(), where the cost is minimal? I am using Géron's book on scikit-learn and TensorFlow; on page 137 he trains the model on the petal widths. I did the following:
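
[Editorial note: a hedged sketch of that setup, assuming the usual iris petal-width example from the book; details may differ from the original code.]

from sklearn import datasets
from sklearn.linear_model import LogisticRegression

iris = datasets.load_iris()
X = iris["data"][:, 3:]                  # petal width only
y = (iris["target"] == 2).astype(int)    # 1 if Iris virginica, else 0

log_reg = LogisticRegression(solver='lbfgs')
log_reg.fit(X, y)

# parameters at the minimum of the (regularized) cost found by .fit()
print(log_reg.coef_, log_reg.intercept_)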

Re: [scikit-learn] LogisticRegression coef_ greater than n_features?

2019-01-08 Thread Sebastian Raschka
It seems like it's determined by the order in which they occur in the training set. E.g.,

from sklearn.preprocessing import OneHotEncoder
import numpy as np

x = np.array([['b'], ['a'], ['b']])
ohe = OneHotEncoder()
xt = ohe.fit_transform(x)
xt.todense()
matrix([[0., 1.],
        [1., 0.],
        [0., 1.]])
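
[Editorial note: a small follow-up sketch, not part of the original mail. The fitted encoder's categories_ attribute shows which category each output column encodes.]

# continuing the example above
print(ohe.categories_)
# -> [array(['a', 'b'], ...)]: column 0 encodes 'a', column 1 encodes 'b'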

Re: [scikit-learn] LogisticRegression coef_ greater than n_features?

2019-01-08 Thread pisymbol
Also Sebastian, I have binary classes but they are strings: clf.classes_: array(['American', 'Southwest'], dtype=object) On Tue, Jan 8, 2019 at 9:51 AM pisymbol wrote: > If that is the case, what order are the coefficients in then? > > -aps > > On Tue, Jan 8, 2019 at 12:48 AM Sebastian Raschka wrote:
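
[Editorial note: a minimal sketch for the string-label case; the labels are reused from the post, the data is invented. With two classes, the single row of coef_ is the weight vector for classes_[1], the class treated as positive.]

import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array(['American', 'American', 'Southwest', 'Southwest'])

clf = LogisticRegression(solver='lbfgs').fit(X, y)
print(clf.classes_)   # ['American' 'Southwest'], sorted alphabetically
# positive decision scores favour classes_[1], i.e. 'Southwest' here
print(clf.coef_, clf.intercept_)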

Re: [scikit-learn] LogisticRegression coef_ greater than n_features?

2019-01-08 Thread pisymbol
If that is the case, what order are the coefficients in then? -aps On Tue, Jan 8, 2019 at 12:48 AM Sebastian Raschka wrote: > E.g., if you have a feature with values 'a', 'b', 'c', then applying the > one-hot encoder will transform this into 3 features. > > Best, > Sebastian > > > On Jan 7, 2019
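
[Editorial note: one way to check the ordering directly, sketched with invented feature values and labels. The encoder's get_feature_names() lines up column for column with coef_.]

import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LogisticRegression

X_raw = np.array([['a'], ['b'], ['c'], ['a'], ['c']])
y = np.array([0, 1, 1, 0, 1])

ohe = OneHotEncoder()
X = ohe.fit_transform(X_raw)          # sparse one-hot matrix

clf = LogisticRegression(solver='lbfgs').fit(X, y)

# one coefficient per encoded column, in the encoder's column order
for name, w in zip(ohe.get_feature_names(), clf.coef_[0]):
    print(name, w)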

Re: [scikit-learn] LogisticRegression coef_ greater than n_features?

2019-01-07 Thread Sebastian Raschka
E.g., if you have a feature with values 'a', 'b', 'c', then applying the one-hot encoder will transform this into 3 features. Best, Sebastian > On Jan 7, 2019, at 11:02 PM, pisymbol wrote: > > On Mon, Jan 7, 2019 at 11:50 PM pisymbol wrote: > According to the doc (0.20.2) the coef_ variables are supposed to be shape (1, n_features)
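
[Editorial note: a quick sketch of that expansion, with a toy array invented here.]

import numpy as np
from sklearn.preprocessing import OneHotEncoder

x = np.array([['a'], ['b'], ['c']])
print(OneHotEncoder().fit_transform(x).toarray())
# [[1. 0. 0.]
#  [0. 1. 0.]
#  [0. 0. 1.]]
# one original feature with three values becomes three encoded columns,
# so coef_ ends up with one entry per encoded column, not per raw feature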

Re: [scikit-learn] LogisticRegression coef_ greater than n_features?

2019-01-07 Thread Sebastian Raschka
Maybe check a) if the actual labels of the training examples don't start at 0, and b) if you have gaps, e.g., if your unique training labels are 0, 1, 4, ..., 23. Best, Sebastian > On Jan 7, 2019, at 10:50 PM, pisymbol wrote: > > According to the doc (0.20.2) the coef_ variables are supposed to be shape (1, n_features) for binary classification.
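
[Editorial note: those checks are easy to run; a sketch with invented gappy labels. coef_ gets one row per distinct class found in y, so classes_ and coef_.shape reveal the problem immediately.]

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X = rng.randn(80, 5)
y = rng.choice([0, 1, 4, 23], size=80)     # labels with gaps

clf = LogisticRegression(solver='lbfgs', multi_class='auto', max_iter=1000).fit(X, y)
print(clf.classes_)       # [ 0  1  4 23]
print(clf.coef_.shape)    # (4, 5): one row per class, one column per feature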

Re: [scikit-learn] LogisticRegression coef_ greater than n_features?

2019-01-07 Thread pisymbol
On Mon, Jan 7, 2019 at 11:50 PM pisymbol wrote: > According to the doc (0.20.2) the coef_ variables are supposed to be shape > (1, n_features) for binary classification. Well, I created a Pipeline and > performed a GridSearchCV to create a LogisticRegression model that does > fairly well. However, when I look at coef_ it has more entries than n_features.
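
[Editorial note: a hedged sketch, with step names and data invented here, of getting at coef_ when the estimator sits inside a Pipeline tuned by GridSearchCV.]

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

pipe = Pipeline([('scale', StandardScaler()),
                 ('clf', LogisticRegression(solver='lbfgs', max_iter=1000))])
grid = GridSearchCV(pipe, {'clf__C': [0.1, 1.0, 10.0]}, cv=5)
grid.fit(X, y)

# the fitted LogisticRegression lives inside the best pipeline
best_lr = grid.best_estimator_.named_steps['clf']
print(best_lr.coef_.shape)    # (1, n_features) for this binary problem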