week and I got these links from Andy's response
(thanks Andy!)
-Jason
From: Pagliari, Roberto [mailto:rpagli...@appcomsci.com]
Sent: Tuesday, April 14, 2015 3:08 PM
To: scikit-learn-general@lists.sourceforge.net
Subject: Re: [Scikit-learn-general] adaboost parameters
hi Jason/Andreas,
ed in the video. I don't
know whether other tips or rules of thumb are available.
Thanks,
From: Jason Wolosonovich [jmwol...@asu.edu]
Sent: Monday, April 13, 2015 10:47 PM
To: scikit-learn-general@lists.sourceforge.net
Subject: Re: [Scikit-learn-general] adaboost
oject.
-Jason
From: Andreas Mueller [mailto:t3k...@gmail.com]
Sent: Monday, April 13, 2015 3:31 PM
To: scikit-learn-general@lists.sourceforge.net
Subject: Re: [Scikit-learn-general] adaboost parameters
You might consider using gradient boosting instead.
see https://www.youtube.com/watch?v=IXZKgIsZRm0
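[For reference, a minimal sketch of the gradient boosting alternative Andreas suggests, using scikit-learn's GradientBoostingClassifier. The synthetic dataset and parameter values below are illustrative assumptions, not from this thread.]

```python
# Sketch: gradient boosting as an alternative to AdaBoost.
# Dataset and hyperparameter values are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small learning rate paired with more trees is a common starting point.
clf = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1,
                                 max_depth=3, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```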
Subject: Re: [Scikit-learn-general] adaboost parameters
What is your dataset like? How are you building your individual
classifier that you are ensembling with AdaBoost? A common-use case
would be boosted decision stumps (one-level decision trees).
http://en.wikipedia.org/wiki/Decision_stump
http
Thanks,
From: Jason Wolosonovich [mailto:jmwol...@asu.edu]
Sent: Saturday, April 11, 2015 9:13 AM
To: scikit-learn-general@lists.sourceforge.net
Subject: Re: [Scikit-learn-general] adaboost parameters
What is your dataset like? How are you building your individual classifier that
you are ensembling with AdaBoost?
From: Pagliari, Roberto [mailto:rpagli...@appcomsci.com]
Sent: Friday, April 10, 2015 1:18 PM
To: scikit-learn-general@lists.sourceforge.net
Subject: [Scikit-learn-general] adaboost parameters
When using adaboost, what is a range of values of n_estimators and learning
rate that makes sense to optimize over?
Thank you,
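[For reference, one common way to answer this question empirically is a cross-validated grid search over both parameters. The ranges below are illustrative assumptions, not recommendations from this thread.]

```python
# Sketch: grid search over AdaBoost's n_estimators and learning_rate.
# The parameter ranges and dataset are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, random_state=0)

param_grid = {
    "n_estimators": [50, 100, 200],
    "learning_rate": [0.1, 0.5, 1.0],
}
search = GridSearchCV(AdaBoostClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```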