[ https://issues.apache.org/jira/browse/SPARK-15130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15271531#comment-15271531 ]
Xin Ren commented on SPARK-15130:
---------------------------------

Hi, I just found that the PySpark class DecisionTreeClassifier has a setParams method that roughly matches the Scala one. Do you mean to create a separate "Param" class?

{code}
@keyword_only
@since("1.4.0")
def setParams(self, featuresCol="features", labelCol="label", predictionCol="prediction",
              probabilityCol="probability", rawPredictionCol="rawPrediction",
              maxDepth=5, maxBins=32, minInstancesPerNode=1, minInfoGain=0.0,
              maxMemoryInMB=256, cacheNodeIds=False, checkpointInterval=10,
              impurity="gini", seed=None):
    """
    setParams(self, featuresCol="features", labelCol="label", predictionCol="prediction", \
              probabilityCol="probability", rawPredictionCol="rawPrediction", \
              maxDepth=5, maxBins=32, minInstancesPerNode=1, minInfoGain=0.0, \
              maxMemoryInMB=256, cacheNodeIds=False, checkpointInterval=10, impurity="gini", \
              seed=None)
    Sets params for the DecisionTreeClassifier.
    """
    kwargs = self.setParams._input_kwargs
    return self._set(**kwargs)
{code}

> PySpark decision tree params should include default values to match Scala
> --------------------------------------------------------------------------
>
>                 Key: SPARK-15130
>                 URL: https://issues.apache.org/jira/browse/SPARK-15130
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, ML, PySpark
>            Reporter: holdenk
>            Priority: Minor
>
> As part of checking the documentation in SPARK-14813, we found that the PySpark
> decision tree params do not include their default values in the docs (unlike the
> Scala ones). While the existing Scala default values will have been used either
> way, this information is likely worth exposing in the docs.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
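For illustration, here is a minimal sketch of how the requested default values might be surfaced in a param's documentation string. This uses a simplified, hypothetical stand-in for pyspark.ml.param.Param (the real class and its doc-handling differ); the helper with_default is an assumption for this sketch, not an existing Spark API:

{code}
class Param(object):
    """Simplified stand-in for pyspark.ml.param.Param (hypothetical)."""
    def __init__(self, name, doc):
        self.name = name
        self.doc = doc


def with_default(param, default):
    """Return a copy of the param whose doc string mentions its default value.

    This mirrors how the Scala docs read, e.g. "Maximum depth of the tree. (default: 5)".
    """
    return Param(param.name, "%s (default: %s)" % (param.doc, default))


# Example: annotate maxDepth with the Scala default of 5.
max_depth = Param("maxDepth", "Maximum depth of the tree.")
max_depth = with_default(max_depth, 5)
print(max_depth.doc)  # Maximum depth of the tree. (default: 5)
{code}

With something like this, a call such as explainParams() could render the same default information PySpark users currently only find in the Scala API docs.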