[jira] [Commented] (SPARK-15130) PySpark decision tree params should include default values to match Scala

2016-05-04 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-15130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15271611#comment-15271611
 ] 

Apache Spark commented on SPARK-15130:
--

User 'holdenk' has created a pull request for this issue:
https://github.com/apache/spark/pull/12914

> PySpark decision tree params should include default values to match Scala
> -
>
> Key: SPARK-15130
> URL: https://issues.apache.org/jira/browse/SPARK-15130
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, ML, PySpark
>Reporter: holdenk
>Priority: Minor
>
> While checking the documentation in SPARK-14813, I found that PySpark 
> decision tree params do not include their default values (unlike the Scala 
> ones). Although the existing Scala default values are used under the hood, 
> this information is worth exposing in the docs.
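One way to surface the defaults is to append them to each param's doc string when the param is defined. A minimal, hypothetical sketch of the idea (the helper name `doc_with_default` and the doc text are illustrative, not PySpark's actual API):

```python
# Hypothetical sketch: append a param's default value to its doc string,
# the kind of change this issue proposes for the PySpark param docs.
def doc_with_default(doc, default):
    # Illustrative helper, not part of PySpark.
    return "%s (default: %s)" % (doc, default)

max_depth_doc = doc_with_default("Maximum depth of the tree. (>= 0)", 5)
print(max_depth_doc)  # Maximum depth of the tree. (>= 0) (default: 5)
```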



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-15130) PySpark decision tree params should include default values to match Scala

2016-05-04 Thread Holden Karau (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-15130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15271585#comment-15271585
 ] 

Holden Karau commented on SPARK-15130:
--

I mean that the pydocs should include what the default value is. I'm
working on a PR for this; I'll cc you when it's up.












[jira] [Commented] (SPARK-15130) PySpark decision tree params should include default values to match Scala

2016-05-04 Thread Xin Ren (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-15130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15271531#comment-15271531
 ] 

Xin Ren commented on SPARK-15130:
-

Hi, I just found that the DecisionTreeClassifier class in PySpark has a 
setParams method that roughly matches the Scala one.

Do you mean we should create a separate "Param" class?

{code}
@keyword_only
@since("1.4.0")
def setParams(self, featuresCol="features", labelCol="label",
              predictionCol="prediction", probabilityCol="probability",
              rawPredictionCol="rawPrediction", maxDepth=5, maxBins=32,
              minInstancesPerNode=1, minInfoGain=0.0, maxMemoryInMB=256,
              cacheNodeIds=False, checkpointInterval=10, impurity="gini",
              seed=None):
    """
    setParams(self, featuresCol="features", labelCol="label", \
              predictionCol="prediction", probabilityCol="probability", \
              rawPredictionCol="rawPrediction", maxDepth=5, maxBins=32, \
              minInstancesPerNode=1, minInfoGain=0.0, maxMemoryInMB=256, \
              cacheNodeIds=False, checkpointInterval=10, impurity="gini", \
              seed=None)
    Sets params for the DecisionTreeClassifier.
    """
    kwargs = self.setParams._input_kwargs
    return self._set(**kwargs)
{code}
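For context, `setParams` only applies the keyword arguments the caller actually passed; the `@keyword_only` decorator captures them in `_input_kwargs`. A simplified, self-contained sketch of that mechanism (not PySpark's real implementation, which lives in the `pyspark` package):

```python
import functools

# Simplified version of PySpark's keyword_only decorator: accept only
# keyword arguments and stash them on the wrapper for later inspection.
def keyword_only(func):
    @functools.wraps(func)
    def wrapper(self, **kwargs):
        wrapper._input_kwargs = kwargs
        return func(self, **kwargs)
    return wrapper

class Estimator(object):
    def __init__(self):
        self.paramMap = {}

    @keyword_only
    def setParams(self, maxDepth=5, maxBins=32):
        # Only the kwargs the caller explicitly passed end up here.
        kwargs = self.setParams._input_kwargs
        return self._set(**kwargs)

    def _set(self, **kwargs):
        self.paramMap.update(kwargs)
        return self

est = Estimator().setParams(maxDepth=3)
print(est.paramMap)  # {'maxDepth': 3} -- maxBins was not explicitly set
```

This is why the defaults shown in the signature still matter for the docs: params left at their defaults never enter the param map, so the doc string is the only place a user sees them.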



