Re: Did anybody come across this random-forest issue with spark 2.0.1.

2016-10-18 Thread 市场部
..., "user@spark.apache.org" <user@spark.apache.org> Subject: Re: Did anybody come across this random-forest issue with spark 2.0.1. Please increase the value of "maxMemoryInMB" of your RandomForestClassifier or
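The fix suggested in this thread (raising `maxMemoryInMB` on the classifier) can be sketched as follows. This is a minimal sketch against Spark's Scala ML API; `trainingData` and the column names are placeholders, not taken from the thread, and the chosen value of 1024 is illustrative.

```scala
import org.apache.spark.ml.classification.RandomForestClassifier

// trainingData is a hypothetical DataFrame with "label" and "features" columns.
val rf = new RandomForestClassifier()
  .setLabelCol("label")
  .setFeaturesCol("features")
  // Raise maxMemoryInMB above its 256 MB default; this parameter bounds the
  // memory used for split-statistics aggregation and therefore how many tree
  // nodes can be split in each iteration.
  .setMaxMemoryInMB(1024)

val model = rf.fit(trainingData)
```

Note that 268435456 bytes, the `MemoryUsage` figure quoted below, is exactly 256 MB, i.e. the default `maxMemoryInMB`.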

Re: Did anybody come across this random-forest issue with spark 2.0.1.

2016-10-17 Thread Yanbo Liang
MemoryUsage=268435456. This allows splitting 80082 nodes in this iteration. From: zhangjianxin <zhangjian...@didichuxing.com> Date: Monday, October 17, 2016, 8:16 PM To: Xi Shen <davidshe...@gmail.com> Cc: "user@spark.apache.org" <user@spark.apache.org>

Re: Did anybody come across this random-forest issue with spark 2.0.1.

2016-10-17 Thread 市场部
...@gmail.com> Date: Monday, October 17, 2016, 8:00 PM To: zhangjianxin <zhangjian...@didichuxing.com>, "user@spark.apache.org" <user@spark.apache.org> Subject: Re: Did an

Re: Did anybody come across this random-forest issue with spark 2.0.1.

2016-10-17 Thread 市场部
...@didichuxing.com>, "user@spark.apache.org" <user@spark.apache.org> Subject: Re: Did anybody come across this random-forest issue with spark 2.0.1. Did you also upgrade to Java from v7 to v8?