[jira] [Commented] (SPARK-7675) PySpark spark.ml Params type conversions

2015-11-09 Thread Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-7675?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14997814#comment-14997814 ]

Apache Spark commented on SPARK-7675:
-------------------------------------

User 'holdenk' has created a pull request for this issue:
https://github.com/apache/spark/pull/9581

> PySpark spark.ml Params type conversions
> ----------------------------------------
>
> Key: SPARK-7675
> URL: https://issues.apache.org/jira/browse/SPARK-7675
> Project: Spark
>  Issue Type: Improvement
>  Components: ML, PySpark
>Reporter: Joseph K. Bradley
>Priority: Minor
>
> Currently, PySpark wrappers for spark.ml Scala classes are brittle when 
> accepting Param types.  E.g., Normalizer's "p" param cannot be set to "2" (an 
> integer); it must be set to "2.0" (a float).  Fixing this is not trivial, 
> since there does not appear to be a natural place to insert the conversion 
> before the Python wrappers call Java's Params setter method.
> A possible fix would be to add a method "_checkType" to PySpark's Param 
> class which checks the type, raises an error if needed, and converts types 
> when relevant (e.g., int to float, or scipy matrix to array).  The Java 
> wrapper method which copies params to Scala could call this method when 
> available.
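
For illustration, here is a minimal sketch of the kind of hook the description
proposes, assuming a Param class that carries an expected Python type. The
names (expectedType, _checkType's exact conversions) are assumptions for
illustration, not necessarily what the linked pull request implements:

    class Param(object):
        """Simplified stand-in for pyspark.ml.param.Param."""

        def __init__(self, parent, name, doc, expectedType=None):
            self.parent = parent
            self.name = name
            self.doc = doc
            # Hypothetical: the Python type the Scala side expects, e.g. float.
            self.expectedType = expectedType

        def _checkType(self, value):
            """Return `value` coerced to expectedType where a safe
            conversion exists; otherwise raise a TypeError."""
            if self.expectedType is None or isinstance(value, self.expectedType):
                return value
            # Widen ints to floats so Normalizer(p=2) behaves like p=2.0.
            if self.expectedType is float and isinstance(value, int):
                return float(value)
            # Flatten numpy/scipy containers to plain lists when a list
            # is expected (assumed conversion for the "matrix to array" case).
            if self.expectedType is list and hasattr(value, "tolist"):
                return value.tolist()
            raise TypeError("Param %s expected %s, got %s" %
                            (self.name, self.expectedType, type(value)))

Under this sketch, the wrapper code that copies params to Scala would call
p._checkType(value) before handing the value across Py4J, so an int 2 would
arrive on the JVM side as the double 2.0.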



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-7675) PySpark spark.ml Params type conversions

2015-11-06 Thread holdenk (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-7675?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14994212#comment-14994212 ]

holdenk commented on SPARK-7675:
--------------------------------

I'll give this a shot since I've been doing some other work in the intersection 
of PySpark and ML.
