2013/7/7 Ian Ozsvald <[email protected]>:
> Following on from the previous post, I thought (from reading alone,
> with no prior hands-on experience with AdaBoost) that the main goal of
> AdaBoost was to combine weak classifiers (e.g. a depth-restricted
> DecisionTree) rather than to build an ensemble of strong classifiers
> (as in e.g. a RandomForest).
>
> The example on the site:
> http://scikit-learn.org/dev/auto_examples/ensemble/plot_forest_iris.html
> uses DecisionTrees with max_depth=None for each of the 4 classifiers.
> Using a depth-restricted classifier (e.g. max_depth=3) for AdaBoost
> yields the same classification quality in this example.
>
> Might the example say more about AdaBoost's ability to use weak
> classifiers if we used a restricted depth DecisionTree?

+1, PR accepted :)

Boosting is good for ensembling a large number of underfitting models
and thus correcting their individual bias.
Bagging and other randomized voting aggregates are good for ensembling
a large number of overfitting models and thus correcting their
individual variance.
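A quick sketch of the contrast (assuming scikit-learn is installed; the
specific parameter values, n_estimators=50 and max_depth=3, are just
illustrative choices, not taken from the example on the site):

```python
# Compare boosting of shallow (high-bias) trees against a random forest
# of deep (high-variance) trees on the iris dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Boosting: many underfitting learners, each reweighted to correct
# the mistakes of its predecessors (reduces bias).
boosted = AdaBoostClassifier(
    DecisionTreeClassifier(max_depth=3),  # weak, depth-restricted learner
    n_estimators=50,
    random_state=0,
)

# Bagging-style averaging: many overfitting learners whose votes are
# aggregated (reduces variance).
forest = RandomForestClassifier(
    n_estimators=50,
    max_depth=None,  # fully grown trees
    random_state=0,
)

for name, clf in [("AdaBoost", boosted), ("RandomForest", forest)]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(name, scores.mean())
```

On a toy dataset like iris both reach similar accuracy, which is why
the plot_forest_iris example does not really showcase AdaBoost's
strength with weak learners.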

--
Olivier
http://twitter.com/ogrisel - http://github.com/ogrisel

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
