Hi Oliver,
Thanks for the suggestion. The package seems handy; I will give it a try.
Regards,
Mahmood
On Sun, Jan 24, 2021 at 12:55 PM Oliver Tomic via scikit-learn wrote:
>
> Hi Mahmood,
>
> The information you need is given by the individual explained variance for
> each variable / feature.
Hi Mahmood,
The information you need is given by the individual explained variance for each
variable / feature. You get that information from the hoggorm package (Python):
https://github.com/olivertomic/hoggorm
https://hoggorm.readthedocs.io/en/latest/index.html
Here is one of the PCA examples from the documentation.
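A minimal sketch in that spirit, based on the hoggorm README (the nipalsPCA
constructor and the X_cumCalExplVar_indVar / X_loadings methods are taken from
that documentation and are worth double-checking against the docs):

import numpy as np
import hoggorm as ho

# Toy data: 20 samples x 4 variables (random, for illustration only).
X = np.random.random((20, 4))

# Fit a PCA model; Xstand=True standardizes each variable first.
model = ho.nipalsPCA(arrX=X, Xstand=True, numComp=3)

# Cumulative calibrated explained variance for each individual variable,
# across an increasing number of components.
print(model.X_cumCalExplVar_indVar())

# Loadings: the weight of each variable on each component.
print(model.X_loadings())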
Hi
Thanks for the replies. I read about the available functions in the
PCA section of the documentation. Consider the following code:

import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# x is the raw data matrix (n_samples, n_features) loaded earlier
x = StandardScaler().fit_transform(x)
pca = PCA()
principalComponents = pca.fit_transform(x)
principalDf = pd.DataFrame(data=principalComponents)
loadings = pca.components_
finalDf = principalDf  # truncated in the original ("finalDf = p..."); likely a concat with the target labels
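Building on that snippet, the question from the original post (which PC best
captures a given variable) can be read straight off loadings; a minimal
sketch, assuming four features and using hypothetical feature names purely
to make the output readable:

import pandas as pd

# loadings (= pca.components_) has shape (n_components, n_features):
# row i holds the weight of every original variable on PC i.
feature_names = ['var1', 'var2', 'var3', 'var4']  # hypothetical names

loadings_df = pd.DataFrame(
    loadings,
    columns=feature_names,
    index=['PC%d' % (i + 1) for i in range(loadings.shape[0])],
)

# For each variable, the PC on which it has the largest absolute weight.
print(loadings_df.abs().idxmax(axis=0))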
Hi Mahmood,
There are different pieces of info that you can get from PCA (see the sketch
below):
1. How important is a given PC for reconstructing the entire dataset -> This
is given by explained_variance_ratio_, as Guillaume suggested.
2. What is the contribution of each feature to each PC (remember that a
PC is a linear combination of the original features) -> This is given by
components_.
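A minimal scikit-learn sketch of both pieces of information, on random data
used purely for illustration:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 5))  # toy data: 100 samples, 5 features

pca = PCA().fit(X)

# (1) importance of each PC for reconstructing the dataset:
# one value per PC, summing to 1.
print(pca.explained_variance_ratio_)

# (2) contribution of each feature to each PC:
# components_ has shape (n_components, n_features), so row i holds
# the weights of the original features in PC i.
print(pca.components_)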
Hi Mahmood,
I believe your question is answered here:
https://stackoverflow.com/questions/22984335/recovering-features-names-of-explained-variance-ratio-in-pca-with-sklearn
> On 22 Jan 2021, at 10:26, Guillaume Lemaître wrote:
>
> I am not really understanding the question, sorry.
I am not really understanding the question, sorry.
Are you looking for the `explained_variance_ratio_` attribute, which gives you
the relative values of the eigenvalues associated with the eigenvectors?
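For concreteness, a small sketch of that relationship: explained_variance_
holds the eigenvalues themselves, and the ratio is each eigenvalue divided by
the total variance (random data, for illustration only):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(42)
X = rng.normal(size=(50, 3))

pca = PCA().fit(X)

# Eigenvalues of the covariance matrix, one per component.
print(pca.explained_variance_)

# The same eigenvalues, normalized to sum to 1.
print(pca.explained_variance_ratio_)

# Identical here because all components are kept, so the total
# variance equals the sum of the retained eigenvalues.
print(pca.explained_variance_ / pca.explained_variance_.sum())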
On Fri, 22 Jan 2021 at 10:16, Mahmood Naderan wrote:
> Hi
> I have a question about PCA:
Hi
I have a question about PCA: how can we determine which factor (principal
component) best captures a given variable X? For example, one variable may
have a low weight in the first PC but a higher weight in the fifth PC.
When I use the PCA from scikit-learn, I have to manually work this out.