[ https://issues.apache.org/jira/browse/SPARK-5016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14306688#comment-14306688 ]

Manoj Kumar edited comment on SPARK-5016 at 2/5/15 8:09 AM:
------------------------------------------------------------

Hi, I would like to fix this (since I'm familiar with this part of the code to 
some extent), and maybe we could merge it before the sparse input issue.

1. As a heuristic, how large should k be?
2. By "distribute", do you mean storing the samples 
(https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/clustering/GaussianMixture.scala#L140)
 as a collection using sc.parallelize, so that they can be operated on in 
parallel across the k components (roughly as in the sketch below)? What role 
does numFeatures play?
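To make question 2 concrete, here is a minimal local-mode sketch (not a patch 
against GaussianMixture.scala) of one way the per-component covariance 
inverses could be parallelized across k via sc.parallelize; the names covs, 
numFeatures, and DistributedCovInverse are illustrative only and not part of 
MLlib, and in the real code the inverse happens inside MultivariateGaussian 
rather than in a standalone loop.

{code:scala}
import breeze.linalg.{DenseMatrix => BDM, DenseVector => BDV, diag, inv}
import org.apache.spark.{SparkConf, SparkContext}

object DistributedCovInverse {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("gmm-inverse-sketch").setMaster("local[*]"))

    val k = 4            // number of Gaussian components (illustrative)
    val numFeatures = 3  // data dimensionality (illustrative)

    // Stand-in covariance matrices; in GaussianMixtureEM these would be the
    // per-cluster sample covariances computed during initialization.
    val covs: Seq[BDM[Double]] =
      Seq.fill(k)(diag(BDV.fill(numFeatures)(2.0)))

    // Distribute the k matrices as an RDD so each inverse runs in its own
    // task, instead of inverting all of them sequentially on the driver.
    val inverses: Array[(Int, BDM[Double])] =
      sc.parallelize(covs.zipWithIndex, numSlices = k)
        .map { case (cov, i) => (i, inv(cov)) }
        .collect()

    inverses.foreach { case (i, m) => println(s"component $i inverse:\n$m") }
    sc.stop()
  }
}
{code}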

Thanks.


> GaussianMixtureEM should distribute matrix inverse for large numFeatures, k
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-5016
>                 URL: https://issues.apache.org/jira/browse/SPARK-5016
>             Project: Spark
>          Issue Type: Improvement
>          Components: MLlib
>    Affects Versions: 1.2.0
>            Reporter: Joseph K. Bradley
>
> If numFeatures or k are large, GMM EM should distribute the matrix inverse 
> computation for Gaussian initialization.


