[ https://issues.apache.org/jira/browse/SPARK-7675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-7675:
-----------------------------------

    Assignee:     (was: Apache Spark)

> PySpark spark.ml Params type conversions
> ----------------------------------------
>
>                 Key: SPARK-7675
>                 URL: https://issues.apache.org/jira/browse/SPARK-7675
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, PySpark
>            Reporter: Joseph K. Bradley
>            Priority: Minor
>
> Currently, PySpark wrappers for spark.ml Scala classes are brittle about the 
> types they accept for Params.  E.g., Normalizer's "p" param cannot be set to 
> "2" (an integer); it must be set to "2.0" (a float).  Fixing this is not 
> trivial since there does not appear to be a natural place to insert the 
> conversion before the Python wrappers call Java's Params setter method.
> A possible fix would be to add a method "_checkType" to PySpark's Param 
> class which checks the type, raises an error if needed, and converts types 
> where relevant (e.g., int to float, or scipy matrix to array).  The Java 
> wrapper method which copies params to Scala could call this method when 
> available (see the sketch below).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
