[jira] [Assigned] (SPARK-13355) Replace GraphImpl.fromExistingRDDs by Graph

2016-02-16 Thread Apache Spark (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-13355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-13355:


Assignee: Xiangrui Meng  (was: Apache Spark)

> Replace GraphImpl.fromExistingRDDs by Graph
> ---
>
> Key: SPARK-13355
> URL: https://issues.apache.org/jira/browse/SPARK-13355
> Project: Spark
>  Issue Type: Bug
>  Components: ML, MLlib
>Affects Versions: 1.3.1, 1.4.1, 1.5.2, 1.6.0, 2.0.0
>Reporter: Xiangrui Meng
>Assignee: Xiangrui Meng
>
> `GraphImpl.fromExistingRDDs` expects a preprocessed vertex RDD as input. We 
> call it in LDA without validating this requirement, so it might introduce 
> errors. Replacing it with `Graph.apply` would be safer and more appropriate 
> because it is a public API. 
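A minimal sketch of the proposed substitution, assuming GraphX vertex and edge RDDs with `Double` attributes (the attribute types and the `buildGraph` helper are illustrative, not taken from the LDA code itself). It requires a running SparkContext, so it is shown as a sketch rather than a standalone program:

```scala
import org.apache.spark.graphx.{Edge, Graph, VertexId}
import org.apache.spark.rdd.RDD

// Before (private implementation API): GraphImpl.fromExistingRDDs assumes the
// vertex RDD has already been preprocessed (e.g. deduplicated and co-partitioned
// with the edges), and silently misbehaves if the caller has not done so:
//   val graph = GraphImpl.fromExistingRDDs(vertexRDD, edgeRDD)

// After (public API): Graph.apply builds the VertexRDD/EdgeRDD internally,
// deduplicating vertex IDs and joining vertices with edges itself, so no
// preprocessing contract is imposed on the caller.
def buildGraph(vertices: RDD[(VertexId, Double)],
               edges: RDD[Edge[Double]]): Graph[Double, Double] = {
  Graph(vertices, edges)
}
```

The trade-off is a possible extra shuffle, since `Graph.apply` cannot assume the inputs are already partitioned consistently; the issue argues that correctness via the public API is worth that cost.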



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-13355) Replace GraphImpl.fromExistingRDDs by Graph

2016-02-16 Thread Apache Spark (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-13355?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-13355:


Assignee: Apache Spark  (was: Xiangrui Meng)

> Replace GraphImpl.fromExistingRDDs by Graph
> ---
>
> Key: SPARK-13355
> URL: https://issues.apache.org/jira/browse/SPARK-13355
> Project: Spark
>  Issue Type: Bug
>  Components: ML, MLlib
>Affects Versions: 1.3.1, 1.4.1, 1.5.2, 1.6.0, 2.0.0
>Reporter: Xiangrui Meng
>Assignee: Apache Spark
>
> `GraphImpl.fromExistingRDDs` expects a preprocessed vertex RDD as input. We 
> call it in LDA without validating this requirement, so it might introduce 
> errors. Replacing it with `Graph.apply` would be safer and more appropriate 
> because it is a public API. 


