How about editing the relevant chunks of code to add an option to scale the
parameters, with scaling turned OFF by default? This would make the behavior
clear without the redundancy Andreas mentioned, and would be more convenient
for users should they want to scale their data.
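
To make the proposal concrete, here is a minimal sketch of what the option could look like. The `scale` parameter is hypothetical (it does not exist in the current PCA API); with the API as it stands, the same effect is obtained by standardizing first:

```
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X = np.random.RandomState(0).rand(10, 4)

# Proposed (hypothetical) API: PCA(n_components=2, scale=False) by default
# (the current behavior), PCA(n_components=2, scale=True) to opt in.

# Equivalent of scale=True with the existing API:
X_scaled = StandardScaler().fit_transform(X)  # zero mean, unit variance per feature
scores = PCA(n_components=2).fit_transform(X_scaled)
```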

On Tue, Oct 17, 2017 at 8:45 AM, <[email protected]> wrote:

>
> Date: Tue, 17 Oct 2017 16:44:55 +0100
> From: Raphael C <[email protected]>
> To: Scikit-learn mailing list <[email protected]>
> Subject: Re: [scikit-learn] Unclear help file about sklearn.decomposition.pca
>
> How about including the scaling that people might want to use in the
> User Guide examples?
>
> Raphael
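>
> A sketch of what such a User Guide example might show, using the existing
> StandardScaler and Pipeline utilities (an illustrative snippet, not taken
> from the current docs):
>
> ```
> from sklearn.pipeline import make_pipeline
> from sklearn.preprocessing import StandardScaler
> from sklearn.decomposition import PCA
>
> # standardize features before the SVD so each variable
> # contributes equally to the principal components
> scaled_pca = make_pipeline(StandardScaler(), PCA(n_components=2))
> ```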
>
> On 17 October 2017 at 16:40, Andreas Mueller <[email protected]> wrote:
> > In general scikit-learn avoids automatic preprocessing. That's a convention
> > to give the user more control and decrease surprising behavior (ostensibly).
> > So scikit-learn will usually do what the algorithm is supposed to do, and
> > nothing more.
> >
> > I'm not sure what the best way to document this is, as this has come up with
> > different models. For example, the R wrapper of libsvm does automatic
> > scaling, while we just apply the SVM.
> >
> > We could add "this model does not do any automatic preprocessing" to all
> > docstrings, but that seems a bit redundant. We could add it to
> > https://github.com/scikit-learn/scikit-learn/pull/9517, but
> > that is probably not where you would have looked.
> >
> > Other suggestions welcome.
> >
> >
> > On 10/16/2017 03:29 PM, Ismael Lemhadri wrote:
> >
> > Thank you all for your feedback.
> > The initial problem I raised wasn't the definition of PCA but what the
> > sklearn method actually does. In practice I would always make sure the data
> > is both centered and scaled before performing PCA. This is the recommended
> > approach because, without scaling, the largest direction could wrongly seem
> > to explain a huge fraction of the variance.
> > So my point was simply to clarify in the help file and the user guide what
> > the PCA class does precisely, so as to leave no ambiguity for the reader.
> > Moving forward, I have now submitted a pull request on GitHub, as initially
> > suggested by Roman on this thread.
> > Best,
> > Ismael
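> >
> > To illustrate that pitfall concretely (an illustrative sketch, assuming
> > only NumPy and scikit-learn): a feature measured on a much larger scale
> > dominates the explained variance unless the data is standardized first.
> >
> > ```
> > import numpy as np
> > from sklearn.decomposition import PCA
> > from sklearn.preprocessing import StandardScaler
> >
> > rng = np.random.RandomState(0)
> > X = rng.rand(100, 2)
> > X[:, 0] *= 1000  # put the first feature on a much larger scale
> >
> > print(PCA().fit(X).explained_variance_ratio_)
> > # ~[1., 0.]: the large-scale feature dominates
> >
> > print(PCA().fit(StandardScaler().fit_transform(X)).explained_variance_ratio_)
> > # roughly balanced once the features are standardized
> > ```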
> >
> > On Mon, 16 Oct 2017 at 11:49 AM, <[email protected]> wrote:
> >>
> >> Date: Mon, 16 Oct 2017 14:44:51 -0400
> >> From: Andreas Mueller <[email protected]>
> >> To: [email protected]
> >> Subject: Re: [scikit-learn] 1. Re: unclear help file for sklearn.decomposition.pca
> >>
> >> On 10/16/2017 02:27 PM, Ismael Lemhadri wrote:
> >> > @Andreas Muller:
> >> > My references do not assume centering, e.g.
> >> > http://ufldl.stanford.edu/wiki/index.php/PCA
> >> > any reference?
> >> >
> >> It kinda does, but is not very clear about it:
> >>
> >> "This data has already been pre-processed so that each of the features
> >> x_1 and x_2 have about the same mean (zero) and variance."
> >>
> >> Wikipedia is much clearer:
> >> "Consider a data matrix X with column-wise zero empirical mean (the sample
> >> mean of each column has been shifted to zero), where each of the n rows
> >> represents a different repetition of the experiment, and each of the p
> >> columns gives a particular kind of feature (say, the results from a
> >> particular sensor)."
> >> https://en.wikipedia.org/wiki/Principal_component_analysis#Details
> >>
> >> I'm a bit surprised to find that ESL says "The SVD of the centered
> >> matrix X is another way of expressing the principal components of the
> >> variables in X",
> >> so they assume scaling? They don't really have a great treatment of PCA,
> >> though.
> >>
> >> Bishop <http://www.springer.com/us/book/9780387310732> and Murphy
> >> <https://mitpress.mit.edu/books/machine-learning-0> are pretty clear
> >> that they subtract the mean (or assume zero mean) but don't standardize.
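> >>
> >> A quick way to check that behavior against the current sklearn API (an
> >> illustrative sketch on a random matrix): PCA's output is unchanged by
> >> pre-centering, but does change if the data is standardized.
> >>
> >> ```
> >> import numpy as np
> >> from sklearn.decomposition import PCA
> >>
> >> X = np.random.RandomState(42).rand(10, 4)
> >> pca = PCA(n_components=2, svd_solver='full')
> >>
> >> T1 = pca.fit_transform(X)                   # PCA centers internally
> >> T2 = pca.fit_transform(X - X.mean(axis=0))  # pre-centering changes nothing
> >> print(np.allclose(T1, T2))                  # True
> >>
> >> X_std = (X - X.mean(axis=0)) / X.std(axis=0)
> >> print(np.allclose(T1, pca.fit_transform(X_std)))  # False: scaling changes the result
> >> ```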
> >>
> >> ------------------------------
> >>
> >> Date: Mon, 16 Oct 2017 20:48:29 +0200
> >> From: Oliver Tomic <[email protected]>
> >> To: Scikit-learn mailing list <[email protected]>
> >> Subject: Re: [scikit-learn] 1. Re: unclear help file for sklearn.decomposition.pca
> >>
> >> Dear Ismael,
> >>
> >> PCA should always involve at least centering, or, if the variables are to
> >> contribute equally, scaling. Here is a reference from the scientific area
> >> named "chemometrics". In chemometrics, PCA is used not only for
> >> dimensionality reduction, but also for interpretation of variance by use of
> >> scores, loadings, correlation loadings, etc.
> >>
> >> If you scroll down to the subsection "Preprocessing" you will find more
> >> info on centering and scaling.
> >>
> >> http://pubs.rsc.org/en/content/articlehtml/2014/ay/c3ay41907j
> >>
> >> best
> >> Oliver
> >>
> >> ---- On Mon, 16 Oct 2017 20:27:11 +0200 Ismael Lemhadri <[email protected]> wrote ----
> >>
> >> @Andreas Muller:
> >> My references do not assume centering, e.g.
> >> http://ufldl.stanford.edu/wiki/index.php/PCA
> >> any reference?
> >>
> >> On Mon, Oct 16, 2017 at 10:20 AM, <[email protected]> wrote:
> >>
> >>
> >> Date: Mon, 16 Oct 2017 13:19:57 -0400
> >> From: Andreas Mueller <[email protected]>
> >> To: [email protected]
> >> Subject: Re: [scikit-learn] unclear help file for sklearn.decomposition.pca
> >>
> >> The definition of PCA has a centering step, but no scaling step.
> >>
> >> On 10/16/2017 11:16 AM, Ismael Lemhadri wrote:
> >>
> >> > Dear Roman,
> >> > My concern is actually not about not mentioning the scaling but about
> >> > not mentioning the centering.
> >> > That is, the sklearn PCA removes the mean but does not mention it in the
> >> > help file.
> >> > This was quite messy for me to debug, as I expected it to either: 1/
> >> > center and scale simultaneously, or 2/ neither scale nor center.
> >> > In my opinion, it would be beneficial to make this behavior explicit in
> >> > the help file.
> >> > Ismael
> >> >
> >>
> >> > On Mon, Oct 16, 2017 at 8:02 AM, <[email protected]> wrote:
> >> >
> >> > Date: Sun, 15 Oct 2017 18:42:56 -0700
> >> > From: Ismael Lemhadri <[email protected]>
> >> > To: [email protected]
> >> > Subject: [scikit-learn] unclear help file for sklearn.decomposition.pca
> >> >
> >> > Dear all,
> >> > The help file for the PCA class is unclear about the preprocessing
> >> > performed on the data.
> >> > You can check on line 410 here:
> >> > https://github.com/scikit-learn/scikit-learn/blob/ef5cb84a/sklearn/decomposition/pca.py#L410
> >> > that the matrix is centered but NOT scaled before performing the
> >> > singular value decomposition.
> >> > However, the help files do not make any mention of it.
> >> > This is unclear for someone who, like me, just wanted to compare PCA and
> >> > np.linalg.svd and check that they give the same results. In academic
> >> > settings, students are often asked to compare different methods and to
> >> > check that they yield the same results. I expect that many students have
> >> > run into this problem before...
> >> > Best,
> >> > Ismael Lemhadri
> >>
> >> > ------------------------------
> >> >
> >> > Date: Mon, 16 Oct 2017 15:16:45 +0200
> >> > From: Roman Yurchak <[email protected]>
> >> > To: Scikit-learn mailing list <[email protected]>
> >> > Subject: Re: [scikit-learn] unclear help file for sklearn.decomposition.pca
> >> >
> >> > Ismael,
> >> >
> >> > as far as I saw the sklearn.decomposition.PCA doesn't mention scaling at
> >> > all (except for the whiten parameter which is post-transformation scaling).
> >> >
> >> > So since it doesn't mention it, it makes sense that it doesn't do any
> >> > scaling of the input. Same as np.linalg.svd.
> >> >
> >> > You can verify that PCA and np.linalg.svd yield the same results, with
> >> >
> >> > ```
> >> > >>> import numpy as np
> >> > >>> from sklearn.decomposition import PCA
> >> > >>> X = np.random.RandomState(42).rand(10, 4)
> >> > >>> n_components = 2
> >> > >>> PCA(n_components, svd_solver='full').fit_transform(X)
> >> > ```
> >> >
> >> > and
> >> >
> >> > ```
> >> > >>> U, s, V = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
> >> > >>> (X - X.mean(axis=0)).dot(V[:n_components].T)
> >> > ```
> >> >
> >> > --
> >> > Roman
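> >> >
> >> > [A caveat worth noting here: the two projections should agree up to the
> >> > sign of each component, since sklearn's PCA applies a deterministic sign
> >> > convention (svd_flip) to the SVD output, while np.linalg.svd does not.]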
> >>
> >> > ------------------------------
> >> >
> >> > Date: Mon, 16 Oct 2017 15:27:48 +0200
> >> > From: Serafeim Loukas <[email protected]>
> >> > To: [email protected]
> >> > Subject: [scikit-learn] Question about LDA's coef_ attribute
> >> >
> >> > Dear Scikit-learn community,
> >> >
> >> > Since the documentation of the LDA
> >> > (http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html)
> >> > is not so clear, I would like to ask if the lda.coef_ attribute stores
> >> > the eigenvectors from the SVD decomposition.
> >> >
> >> > Thank you in advance,
> >> > Serafeim
> >>
> >> > ------------------------------
> >> >
> >> > Date: Mon, 16 Oct 2017 16:57:52 +0200
> >> > From: Alexandre Gramfort <[email protected]>
> >> > To: Scikit-learn mailing list <[email protected]>
> >> > Subject: Re: [scikit-learn] Question about LDA's coef_ attribute
> >> >
> >> > No, it stores the direction of the decision function, to match the API
> >> > of linear models.
> >> >
> >> > HTH
> >> > Alex
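> >> >
> >> > A quick way to check that contract on a toy dataset (an illustrative
> >> > sketch; it relies only on the documented linear-model API):
> >> >
> >> > ```
> >> > import numpy as np
> >> > from sklearn.datasets import load_iris
> >> > from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
> >> >
> >> > X, y = load_iris(return_X_y=True)
> >> > lda = LinearDiscriminantAnalysis().fit(X, y)
> >> >
> >> > # coef_ and intercept_ parameterize the decision function,
> >> > # exactly as in other scikit-learn linear models
> >> > print(np.allclose(lda.decision_function(X),
> >> >                   X @ lda.coef_.T + lda.intercept_))  # True
> >> > ```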
> >>
> >> > ------------------------------
> >> >
> >> > Date: Mon, 16 Oct 2017 17:02:46 +0200
> >> > From: Serafeim Loukas <[email protected]>
> >> > To: Scikit-learn mailing list <[email protected]>
> >> > Subject: Re: [scikit-learn] Question about LDA's coef_ attribute
> >> >
> >> > Dear Alex,
> >> >
> >> > Thank you for the prompt response.
> >> >
> >> > Are the eigenvectors stored in some variable?
> >> > Does the lda.scalings_ attribute contain the eigenvectors?
> >> >
> >> > Best,
> >> > Serafeim
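> >> >
> >> > For inspecting those attributes on a toy dataset (an illustrative
> >> > sketch; per the docs, scalings_ is only available for the 'svd' and
> >> > 'eigen' solvers):
> >> >
> >> > ```
> >> > from sklearn.datasets import load_iris
> >> > from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
> >> >
> >> > X, y = load_iris(return_X_y=True)
> >> > lda = LinearDiscriminantAnalysis(solver='eigen').fit(X, y)
> >> >
> >> > print(lda.coef_.shape)      # (n_classes, n_features): decision-function weights
> >> > print(lda.scalings_.shape)  # columns map features into the discriminant subspace
> >> > ```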
> >
> > --
> >
> > Sent from a mobile phone and may contain errors
> >
> >