Keywords
decision trees, homogeneous forest, heterogeneous forest
Abstract
Splitting on randomly selected attributes is one way to increase diversity within an ensemble of decision trees. Another approach increases diversity by combining multiple tree algorithms. The random forest approach has become popular because it is simple and yields good results on common datasets. We present a technique that combines heterogeneous tree algorithms and contrast it with homogeneous forest algorithms. Our results indicate that random forests perform poorly when faced with irrelevant attributes, while our heterogeneous technique handles them robustly. Further, we show that large ensembles of random trees are more susceptible to diminishing returns than our technique. We obtain better results across a large number of common datasets with a significantly smaller ensemble.
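The abstract contrasts a homogeneous forest of randomly split trees with a small heterogeneous ensemble of different tree algorithms. The paper's specific method is not detailed here, so the following is only a minimal sketch of that contrast using scikit-learn; the choice of base learners and the majority-vote combination rule are illustrative assumptions, not the authors' technique.

```python
# Illustrative sketch of homogeneous vs. heterogeneous tree ensembles.
# NOT the paper's algorithm: the base learners and voting rule below are
# assumptions chosen only to show the contrast described in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier, ExtraTreesClassifier,
                              VotingClassifier)
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Toy dataset with many irrelevant attributes (only 5 of 20 are informative).
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

# Homogeneous: a large forest of trees that split on random attribute subsets.
homogeneous = RandomForestClassifier(n_estimators=100, random_state=0)

# Heterogeneous: a small ensemble mixing different tree-growing algorithms,
# combined by majority vote.
heterogeneous = VotingClassifier(
    estimators=[
        ("gini_tree", DecisionTreeClassifier(criterion="gini", random_state=0)),
        ("entropy_tree", DecisionTreeClassifier(criterion="entropy", random_state=0)),
        ("extra_trees", ExtraTreesClassifier(n_estimators=5, random_state=0)),
    ],
    voting="hard",
)

for name, model in [("homogeneous forest (100 trees)", homogeneous),
                    ("heterogeneous ensemble (7 trees)", heterogeneous)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```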
Original Publication Citation
Gashler, M., Giraud-Carrier, C., and Martinez, T. R., "Decision Tree Ensemble: Small Heterogeneous is Better than Large Homogeneous", Proceedings of ICMLA'08 (International Conference on Machine Learning Applications), 2008.
BYU ScholarsArchive Citation
Gashler, Mike; Giraud-Carrier, Christophe G.; and Martinez, Tony R., "Decision Tree Ensemble: Small Heterogeneous Is Better Than Large Homogeneous" (2008). Faculty Publications. 902.
https://scholarsarchive.byu.edu/facpub/902
Document Type
Peer-Reviewed Article
Publication Date
2008-12-11
Permanent URL
http://hdl.lib.byu.edu/1877/2420
Publisher
IEEE
Language
English
College
Physical and Mathematical Sciences
Department
Computer Science
Copyright Status
© 2008 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Copyright Use Information
http://lib.byu.edu/about/copyright/