[ https://issues.apache.org/jira/browse/SPARK-13724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15183521#comment-15183521 ]
senthil gandhi commented on SPARK-13724:
----------------------------------------

This is what I am trying to understand (I don't use Scala or Java, just Python). If I am using this API from Python and I get a message asking me to increase maxMemoryInMB, there is currently no way to do it, is there?

> Parameter maxMemoryInMB has gone missing in MLlib 1.6.0 DecisionTree.trainClassifier()
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-13724
>                 URL: https://issues.apache.org/jira/browse/SPARK-13724
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.0
>            Reporter: senthil gandhi
>
> DecisionTree.trainClassifier() reports during training that maxMemoryInMB is too small, and then stops. But when I tried to set it, I found that in MLlib of Spark 1.6.0, pyspark.mllib.tree.DecisionTree no longer has this parameter in its named-parameter list.
> (Also, I am not sure whether this is the right place for this issue; kindly educate!)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
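For reference, the newer DataFrame-based spark.ml API does expose this parameter from Python: pyspark.ml.classification.DecisionTreeClassifier accepts maxMemoryInMB as a keyword argument (unlike pyspark.mllib.tree.DecisionTree.trainClassifier in 1.6.0). A minimal sketch of that workaround follows; it assumes a local Spark 1.6 installation, and the toy data and app name are illustrative only:

```python
# Sketch: raising maxMemoryInMB via the spark.ml API, which exposes it
# from Python, as a possible workaround for SPARK-13724.
# Requires a Spark installation; not runnable without one.
from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.ml.classification import DecisionTreeClassifier
from pyspark.mllib.linalg import Vectors  # in Spark 1.6, spark.ml uses mllib vectors

sc = SparkContext(appName="maxMemoryInMB-workaround")  # app name is illustrative
sqlContext = SQLContext(sc)

# Toy training data in the (label, features) shape spark.ml expects.
df = sqlContext.createDataFrame(
    [(0.0, Vectors.dense(0.0, 1.0)),
     (1.0, Vectors.dense(1.0, 0.0))],
    ["label", "features"])

# maxMemoryInMB caps the memory allocated to histogram aggregation
# during tree training; raise it here instead of the (missing)
# trainClassifier keyword.
dt = DecisionTreeClassifier(maxDepth=5, maxMemoryInMB=1024)
model = dt.fit(df)
```

This is only a sketch under the assumption that migrating to the spark.ml estimator API is acceptable; it does not restore the missing keyword on the RDD-based trainClassifier() itself.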