[ https://issues.apache.org/jira/browse/SYSTEMML-1224?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Deron Eriksson resolved SYSTEMML-1224.
--------------------------------------
       Resolution: Fixed
    Fix Version/s: SystemML 0.13

Fixed by [PR369|https://github.com/apache/incubator-systemml/pull/369].

> Migrate vector and labeledpoint classes from mllib to ml
> --------------------------------------------------------
>
>                 Key: SYSTEMML-1224
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-1224
>             Project: SystemML
>          Issue Type: Task
>          Components: APIs, Runtime
>    Affects Versions: SystemML 0.13
>            Reporter: Deron Eriksson
>            Assignee: Deron Eriksson
>             Fix For: SystemML 0.13
>
>
> For Spark 2, running test_mllearn_df.py fails with a SparseVector-to-Vector 
> ClassCastException:
> {code}
> spark-submit --driver-class-path $SYSTEMML_HOME/SystemML.jar 
> test_mllearn_df.py
> {code}
> generates:
> {code}
> Py4JJavaError: An error occurred while calling o206.fit.
> : org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 
> in stage 2.0 failed 1 times, most recent failure: Lost task 1.0 in stage 2.0 
> (TID 17, localhost, executor driver): java.lang.ClassCastException: 
> org.apache.spark.ml.linalg.SparseVector cannot be cast to 
> org.apache.spark.mllib.linalg.Vector
>       at 
> org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils.countNnz(RDDConverterUtils.java:314)
>       at 
> org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils.access$400(RDDConverterUtils.java:71)
>       at 
> org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$DataFrameAnalysisFunction.call(RDDConverterUtils.java:940)
>       at 
> org.apache.sysml.runtime.instructions.spark.utils.RDDConverterUtils$DataFrameAnalysisFunction.call(RDDConverterUtils.java:921)
>       at 
> org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1040)
>       at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>       at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1762)
> {code}
> This can most likely be fixed by migrating the relevant classes from the 
> org.apache.spark.mllib.linalg package to org.apache.spark.ml.linalg.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
