Keywords

decision trees, homogeneous forest, heterogeneous forest

Abstract

Using decision trees that split on randomly selected attributes is one way to increase the diversity within an ensemble of decision trees. Another approach increases diversity by combining multiple tree algorithms. The random forest approach has become popular because it is simple and yields good results with common datasets. We present a technique that combines heterogeneous tree algorithms and contrast it with homogeneous forest algorithms. Our results indicate that random forests do poorly when faced with irrelevant attributes, while our heterogeneous technique handles them robustly. Further, we show that large ensembles of random trees are more susceptible to diminishing returns than our technique. We are able to obtain better results across a large number of common datasets with a significantly smaller ensemble.
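To make the contrast in the abstract concrete, here is a minimal sketch (not the authors' implementation) of the two ensemble styles: a large homogeneous forest of randomized trees versus a small heterogeneous ensemble that votes across differently-configured tree learners. The dataset, scikit-learn components, and hyperparameters are illustrative assumptions only.

```python
# Illustrative sketch only (assumed components, not the paper's exact algorithm):
# contrasts a large homogeneous forest with a small heterogeneous tree ensemble.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (RandomForestClassifier, ExtraTreesClassifier,
                              VotingClassifier)
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Homogeneous: many trees, each splitting on randomly selected attributes.
homogeneous = RandomForestClassifier(n_estimators=200, random_state=0)

# Heterogeneous: a handful of trees built with different split criteria and
# randomization strategies, combined by majority vote.
heterogeneous = VotingClassifier(
    estimators=[
        ("entropy_tree", DecisionTreeClassifier(criterion="entropy", random_state=0)),
        ("gini_tree", DecisionTreeClassifier(criterion="gini", random_state=0)),
        ("random_split_trees", ExtraTreesClassifier(n_estimators=5, random_state=0)),
    ],
    voting="hard",
)

# Compare the two ensembles with cross-validated accuracy.
for name, model in [("homogeneous forest", homogeneous),
                    ("heterogeneous ensemble", heterogeneous)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```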

Original Publication Citation

Gashler, M., Giraud-Carrier, C., and Martinez, T. R., "Decision Tree Ensemble: Small Heterogeneous is Better than Large Homogeneous", Proceedings of ICMLA'08 (International Conference on Machine Learning and Applications), 2008.

Document Type

Peer-Reviewed Article

Publication Date

2008-12-11

Permanent URL

http://hdl.lib.byu.edu/1877/2420

Publisher

IEEE

Language

English

College

Physical and Mathematical Sciences

Department

Computer Science
